Day 2
Track 4: Successful Site Architecture
Panel Members:
Mathew Bailey of Site Logic Marketing and Derrick Wheeler of Acxiom Digital
What exactly is successful site architecture?
Success = Uniqueness, quality content, & website architecture used to make a site user friendly!
Search engines use links to discover URLs, determine relevance, and gauge the importance of a page.
Remember! Search engines use text links to crawl a web site! Use anchor text that will tell both your visitor and the search engines what the next page will bring. You probably want to insert your keywords there if you can! Get creative. (90% of the time you really can come up with a way!)
JavaScript navigation is not conducive to optimization because there is no URL path for spiders to follow. Be sure to use alt text to “validate” a linking button.
Let’s address duplicate content quickly.
Don’t build users’ breadcrumbs (or click paths) from parameters in the URL. This creates a duplicate content issue for search engines. The solution: assign each page one fixed set of breadcrumbs, whether or not that was the path the visitor took; or populate the breadcrumb links dynamically from a cookie.
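The first option above amounts to a simple lookup: every page gets exactly one trail, no matter how the visitor arrived. A minimal sketch in Python, where the page paths and trails are invented examples, not anything from the panel:

```python
# Assign each page one fixed breadcrumb trail, regardless of the
# visitor's actual click path (page names here are hypothetical).
BREADCRUMBS = {
    "/shoes/running.htm": ["Home", "Shoes", "Running"],
    "/shoes/index.htm":   ["Home", "Shoes"],
}

def breadcrumb_for(page):
    """One page, one trail: every crawl of this URL sees identical
    breadcrumb links, so no duplicate-content variants are generated."""
    return " > ".join(BREADCRUMBS.get(page, ["Home"]))

print(breadcrumb_for("/shoes/running.htm"))  # Home > Shoes > Running
```

Because the trail is keyed off the page rather than the request, a spider arriving from any direction always sees the same links.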
Angie mentions that we should display “The quickest path to get here is” ahead of the predetermined breadcrumb links.
My question still stands…your users want to know where they are in your site and how they got there. If you don’t choose the cookie option, aren’t you doing them a disservice by not showing them the true way they came?
I suppose the static breadcrumbs would be better for SEO purposes, though then you are eliminating some potential traffic-driving keywords. It would be up to the site owner to select which keywords would produce the most conversions.
Now that I think of it, would the search engines think the dynamically imported content from the cookie was duplicate content? Probably. Hmmmm…
Don’t forget about 301 redirects, or permanent redirects/moves. Can you 301 instances like duplicate breadcrumb pages? Maybe not directly, but those instances still generate URLs, which can then 301 to the standard or selected page you do want indexed (the one with the preferred, top-converting content).
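In practice the 301 step is just a lookup from each known duplicate URL to the one page you want indexed. A rough sketch, assuming a hand-maintained mapping (the paths are invented for illustration, not real redirect rules):

```python
# Map known duplicate URLs to the one canonical page we want indexed.
# These paths are hypothetical examples.
CANONICAL = {
    "/shoes/index.htm?from=breadcrumb": "/shoes/index.htm",
    "/shoes/": "/shoes/index.htm",
}

def respond(path):
    """Return (HTTP status, Location) for a requested path:
    301 (permanent redirect) for known duplicates, 200 otherwise."""
    if path in CANONICAL:
        return 301, CANONICAL[path]
    return 200, path

print(respond("/shoes/"))  # (301, '/shoes/index.htm')
```

The same table could just as well live in a server rewrite config; the point is that every duplicate address permanently forwards to a single indexable URL.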
Any thoughts?
Here are some more situations where duplicate content intersects with site architecture…
Don’t put session IDs in URLs. They create duplicate content too.
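Stripping session parameters before URLs are emitted (or in a rewrite rule) keeps crawlers from seeing one page under many addresses. A small sketch using Python’s standard library, assuming session parameters with common names like `sid` or `PHPSESSID` (adjust the set to your platform):

```python
from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

# Common session-parameter names; hypothetical, adjust to your platform.
SESSION_PARAMS = {"sid", "sessionid", "phpsessid"}

def strip_session_id(url):
    """Remove session-ID query parameters so every visitor's URL
    collapses to the same crawlable address."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in SESSION_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(strip_session_id("http://example.com/cart.htm?item=3&PHPSESSID=abc123"))
# http://example.com/cart.htm?item=3
```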
If you use relative links (…/folder/pagename.htm) instead of absolute links (www.domainnamehere.com/folder/pagename.htm) when linking your pages together, you may wind up causing duplicate content issues when you have a secure web site. Links on secure pages will be assumed to be secure pages when they are relative, so the spider crawls an https copy of every page it follows. Now you have 2 sites. So use absolute links when you have an https site.
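You can see the mechanism by resolving a relative link against a secure page with Python’s standard library: the relative link inherits the https scheme of the page it sits on, producing a second copy of the URL (the domain and paths below are made up):

```python
from urllib.parse import urljoin

secure_page = "https://www.example.com/checkout/cart.htm"

# A relative link inherits the https scheme of the page it sits on...
relative = urljoin(secure_page, "/folder/pagename.htm")

# ...while an absolute link always points at the one URL you intend.
absolute = "http://www.example.com/folder/pagename.htm"

print(relative)  # https://www.example.com/folder/pagename.htm
print(absolute)  # http://www.example.com/folder/pagename.htm
```

Same page, two addresses: that is the “2 sites” problem in miniature.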
Quick Fact: Website submission is as 90’s as scrunchies, LA Gear and NKOTB.
Quick Tip #1: Move JavaScript to external files.
Quick Tip #2: Validation confirms your code is valid and helps ensure spiders can index your content.
Quick Tip #3: Google Webmaster Central gives you access to a diagnostics page with a summary of how Google crawls your pages.
Still have remaining questions? Want more?
Check out…Mathew Bailey’s Website Architecture.
Or
Derrick Wheeler’s Successful Site Architecture.
Posted By: Angela Collins
Cal Coast Web Design