
SES 2007 – Successful Site Architecture

  • By calcoastwebdesign
  • 1 Tags
  • 3 Comments
  • 22 Aug 2007

Day 2
Track 4: Successful Site Architecture

Panel Members:
Mathew Bailey of Site Logic Marketing and Derrick Wheeler of Acxiom Digital

What exactly is successful site architecture?

Success = Uniqueness, quality content, & website architecture used to make a site user friendly!

Search engines use links to discover URLs, determine relevance, and judge the importance of a page, so be sure to use them.

Remember! Search engines use text links to crawl a web site! Use anchor text that will tell both your visitor and the search engines what the next page will bring. You probably want to insert your keywords there if you can! Get creative. (90% of the time you really can come up with a way!)
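
To illustrate (the page name and wording here are made up), compare a generic link with one whose anchor text describes the destination:

    <!-- vague anchor text: tells neither visitors nor spiders what the next page brings -->
    <a href="/services.htm">Click here</a>

    <!-- descriptive, keyword-rich anchor text -->
    <a href="/services.htm">Custom web design services</a>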

JavaScript navigation is not conducive to optimization because it gives spiders no URL path to follow. Be sure to use alt text to “validate” a linking button.
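
A rough sketch of the difference (file and function names made up): a script-only button gives the spider no URL to follow, while a plain link with alt text on the image button tells both visitor and spider where it leads.

    <!-- script-only "link": no URL path for a spider to follow -->
    <a href="javascript:void(0);" onclick="showCatalog();">Catalog</a>

    <!-- crawlable link; the image button's alt text describes the destination -->
    <a href="/catalog.htm"><img src="/images/catalog-button.gif" alt="Product catalog" /></a>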

Let’s address duplicate content quickly.

Don’t generate users’ breadcrumbs (their click path) from parameters in the URL. This creates a duplicate content issue for search engines. The solution: assign each page a single set of breadcrumbs, whether or not that was the path the visitor took; or build the breadcrumb links dynamically from a cookie.
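
For example (domain and parameter made up), the same product page reached through two different categories can end up living at two URLs that show identical content:

    http://www.example.com/widget.htm?path=home>shoes>running
    http://www.example.com/widget.htm?path=home>sale>running

A spider sees two separate pages to index; a single assigned breadcrumb trail keeps it to one URL.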

Angie is mentioning that we should place “The quickest path to get here is” ahead of the predetermined breadcrumb links.

My question still stands… Your users want to know where they are in your site and how they got there. If you don’t choose the cookie option, aren’t you doing them a disservice by not showing them the true way they came?

I suppose the static breadcrumbs would be better for SEO purposes, but then you are eliminating some potential traffic-driving keywords. I suppose it would be up to the owner to select which keywords would provide the most conversions.

Now that I think of it, would the search engines think the dynamically imported content from the cookie was duplicate content? Probably. Hmmmm…

Don’t forget about 301 redirects, i.e. permanent redirects/moves. Can you 301 cases like duplicate-content breadcrumbs? Maybe not directly, but those instances can generate URLs which then 301 to the standard or selected page you do want indexed (the one with the preferred, top-converting content).
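
If your site runs on Apache, one common way to set up a 301 (the file names here are made up) is a line in .htaccess:

    # permanently (301) redirect the duplicate URL to the page you want indexed
    Redirect 301 /duplicate-page.htm http://www.example.com/preferred-page.htm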

Any thoughts?

Here are some more duplicate-content situations as they relate to site architecture…

No session IDs. They create duplicate content too.
If you use a relative link (…/folder/pagename.htm) instead of an absolute link (www.domainnamehere.com/folder/pagename.htm) when linking your pages to one another, you may wind up causing duplicate content issues on a secure web site. With relative links instead of absolute ones, links on secure pages are assumed to point to secure pages, so the spider crawls an https copy of everything. Now you have two sites. So use absolute links on https sites.
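
As a quick sketch (domain made up), here is the difference as seen from a secure page such as https://www.example.com/checkout.htm:

    <!-- relative link: from an https page this resolves to https://www.example.com/about.htm -->
    <a href="/about.htm">About Us</a>

    <!-- absolute link: always points to the one version you want indexed -->
    <a href="http://www.example.com/about.htm">About Us</a>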

Quick Fact: Website submission is as 90’s as scrunchies, LA Gear and NKOTB.

Quick Tip #1: Move JavaScript to external files.
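
A minimal before/after (file name made up):

    <!-- inline script the spider has to wade through before reaching your content -->
    <script type="text/javascript">
      function toggleMenu() { /* show or hide the navigation menu */ }
    </script>

    <!-- the same code moved to an external file -->
    <script type="text/javascript" src="/scripts/menu.js"></script>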

Quick Tip #2: Validation confirms your code is valid and helps ensure spiders can index your content.

Quick Tip #3: Google Webmaster Central gives you access to a diagnostics page with a crawl summary of how Google goes through your pages.

Still have remaining questions? Want more?

Check out…Mathew Bailey’s Website Architecture.
Or
Derrick Wheeler’s Successful Site Architecture.

Posted By: Angela Collins
Cal Coast Web Design

COMMENTS
I think this session and the usability session should have been back to back.

Not too much new here for me to comment on; we have been practicing these concepts for years.

The breadcrumbs threw me for a loop though, didn't think about that one.

Also, a warning brought up in one of my other classes on duplicate content: add nofollow tags on "printable" or "pdf" page versions. That's new... search engines couldn't always read the .pdf or .doc file types, but now that they can, those versions could get tagged as duplicate content.
I like the “Success = Uniqueness, quality content, & website architecture used to make a site user friendly!” formula; however, it’s so hard to be unique with your website anymore. There are countless websites that deal with the same things, so I agree that beyond being unique you have to have quality content and an easy-to-use, attractive website.

Hmmm, I didn’t know that duplicate content could come from “breadcrumbs” in a URL. I never gave it any thought, but now I see! And yes, you still need to give the visitor a way to determine where they came from, so this seems a bit tricky. Your comment about session IDs is another example of something I have never given any thought to, and I didn’t know that absolute links play a more helpful role than relative links.
I always assumed duplicate content could only come from posting duplicate content, not from breadcrumbs. I think Scott is right here too: it's hard to be unique with so many competitive sites out there, but you can do it. Find out what your competitors are doing and do something different. While much of the content may be similar, the way you present it makes a huge difference in who visitors will choose.