A Day in the Life of SES London 2008
This YouMoz entry was submitted by one of our community members. The author’s views are entirely their own (excluding an unlikely case of hypnosis) and may not reflect the views of Moz.
Following on from the excellent posts by Ciarán on SES London 2008, here are my experiences of the event.
I only attended the final day of the conference, as it seemed to have the most relevant agenda for the areas I'm currently looking at.
I was a little disappointed when I arrived, as the exhibition ran only on the first two days of the conference. I can understand why it isn't practical to run the expo for all three days, but it was a shame not to have it available for all attendees. At least I got a copy of Nicholas Carr's book, "The Big Switch," which I started reading on the train home.
Meet the Crawlers
This first session was a chance to meet representatives from the major search engines, with speakers from Google, Ask, and Microsoft.
Luisella Mazza from Google was first up and showed the benefits of Google Webmaster Central. There is no denying that Google offers great tools and resources, but I would have thought there were more interesting areas to cover. A bit of a missed opportunity, I think.
Tom Alby from Ask.com was quick to remind us that Ask is the fourth-biggest search engine in the UK, and then went on to describe ways to help the crawlers do their job, including correct use of robots.txt, using compression on pages, and Unicode encoding. He also recommended using Lynx to view your site as a crawler does, and advised avoiding links in image maps, as they can be difficult to crawl properly.
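To make the robots.txt and compression points concrete, here is a minimal sketch of the kind of check you can run yourself (my own illustration, not code from the session, and the example.com addresses are placeholders for your own site):

import urllib.request
from urllib.robotparser import RobotFileParser

SITE = "http://www.example.com"
PAGES = ["/", "/products/widget.php"]

# 1. Ask robots.txt which pages a given crawler is allowed to fetch.
robots = RobotFileParser()
robots.set_url(SITE + "/robots.txt")
robots.read()
for path in PAGES:
    print(path, "allowed for Googlebot:", robots.can_fetch("Googlebot", SITE + path))

# 2. Check whether the server answers with gzip compression when asked for it.
request = urllib.request.Request(SITE + "/", headers={"Accept-Encoding": "gzip"})
with urllib.request.urlopen(request) as response:
    print("Content-Encoding for home page:", response.headers.get("Content-Encoding", "none"))

For Tom's Lynx tip, the quick manual equivalent is running lynx -dump against a page, which shows roughly the text-only view a crawler gets.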
Last up was Paul Stoddart from Microsoft, who immediately made it clear that he wasn't the right person to present informatively about the areas of interest to the audience (usually Nathan Buggia would do this sort of thing). His presentation, written overnight, was pretty much a mirror of the Google presentation but showing the Webmaster Live tools. At least it was fairly light-hearted, with Paul poking fun at how "excited" he was that Live now supports compression and "conditional get." He was also open in saying Microsoft were a bit behind the curve with tools to support webmasters, but were getting there. He invited the audience to use the feedback mechanisms to "hurl abuse" (his words) at Nathan and his team until they got it right.
The session could probably have allowed more time for Q&A, as I found that to be the most useful part.
After the session I had a quick chat with Maile Ohye, a developer for Google Webmaster Tools. I managed to get her to promise to look at an issue I've been having downloading the full "top keywords" file for a site I'm working on. It was great to be able to chat to someone in person, as I have had no luck using the forums to get it sorted. Let's hope she gets back to me with some thoughts!
I also asked Luisella about setting a geographic target when a site targets two or more countries. A site I'm working on uses geo-location to customise parts of each page with information relevant to the visitor's country. I was interested to know the impact of setting the target to the US, as that is where 70% of the site's traffic comes from. At the moment the site does better on certain keywords in "UK only" searches on google.co.uk, because it is hosted in the UK.
The advice was that it might be worth setting the target to the US for a while, as it is the highest-traffic region, and monitoring the effects. She pointed out that there will be more competition for keywords in the US than in the UK, so it may not improve rankings that much. If the approach didn't yield better results in the US, or if there was a significant drop in the UK, it would be easy to go back to specifying no geographic target and things would return to the current state of play.
Dynamic Websites – Beyond the Basics
To add to what Ciarán had to say about this session, there were many great points here about dynamic websites. Whilst dynamic sites are more likely to have indexing issues, they can also be easier to optimise to the point of outranking static sites. Strategies to help with dynamic sites include simplifying the technology where possible, considering static replication, looking at URL re-writes, and building a data bridge.
Some other general tips were:
- Extensions do not matter (.php, .jsp, .aspx, etc.) – experiments show that even made-up extensions get indexed.
- Keep an eye out for limitless calendars – they can act as a spider trap.
- Stay away from form-based navigation.
- Remove HTML comments from your pages.
- Don't output empty divs/tables – make sure your code properly handles optional data (see the sketch below).
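On that last point about empty elements, the fix is simply to check for optional data before emitting the wrapper markup. A tiny sketch of the idea (my own illustration, not code from the session):

def render_related_items(items):
    # Skip the wrapper entirely when there is nothing to show, so the page
    # never carries an empty <div> for crawlers to wade through.
    if not items:
        return ""
    rows = "".join("<li>%s</li>" % item for item in items)
    return '<div class="related"><ul>%s</ul></div>' % rows

print(render_related_items([]))            # prints nothing
print(render_related_items(["Widget A"]))  # prints the populated div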
Beyond Linkbait: Getting Authoritative Online Mentions
This session was presented by Alan Webb, Mikkel deMib Svendsen, and Brian Turner. It started by describing the "chicken and egg" situation you face with getting links to your site (i.e., you need links to rank well, but you won't get links if no one knows about you). To solve this, you need to kick-start the process of getting links to your site. Some options include:
- SMO - Social Media Optimisation (e.g., use of social bookmarks)
- Usenet newsgroups – get the word out there
- Newsletter – use your own newsletter to promote new features
- Blogs – report on news, make controversial postings
- Forums – post relevant links to your site (don’t spam!!)
- Keyword advertising around new features
- Press releases – e.g., OpenPR
- Email your users
Some ideas for Linkbait given during the presentations included:
- Create a tool that has some unique feature in your field
- Set up a fun contest with an unusual prize (don’t always expect every contest to work – keep trying)
- Create an online game – something fun and addictive
- Be the first to post breaking news
- Create lists, e.g., Top 10 – people love lists!
- Use controversy but be careful about it backfiring (negative feedback can hurt your domain)
- Freebies – desktop wallpapers, gadgets, etc
- Tutorial/white papers about your field
- Create pages with a "cool" design (they used an example of http://producten.hema.nl – cool but probably useless from a usability perspective)
The best day of the year to launch Linkbait was said to be April 1st – you can get away with anything on this day! Remember when Matt Cutts was supposedly leaving Google for Yahoo?
Finishing off the session was a reminder of how important links to your site are: a site can rank well for a keyword on links alone. Both the Tony Blair "Liar" Googlebomb and the Bush "Miserable Failure" examples were highlighted. A handy extension to the linkdomain: command on Yahoo was also shown, which lets you find out which links to your domain are associated with a particular keyword:
linkdomain:yoursite.com "some keywords" -site:yoursite.com
Here is an example of this in use for the "Miserable Failure" example:
linkdomain:www.whitehouse.gov "miserable failure" -site:www.whitehouse.gov
A final note was to watch out for Black Hat tricks that might be used on your site to get links to other sites. In particular, make sure cross-site scripting (XSS) cannot be used to exploit your site. Mikkel also told us about the fun he has had sending fake referrer URLs to other SEOs' sites, sending them round in circles trying to work out why they were apparently ranking for undesirable keywords. Bless that orange-jacket-wearing man!
My SEM Toolbox
Like Ciarán, I also attended the tools session, which seemed very popular. Instead of duplicating all the links here, I will skip over this session.
Site Clinic
The final session of the day for me, Site Clinic, was hosted by Mikkel (that guy was everywhere I went!) and Bruce Clay. Hoping we might get to discuss some more advanced topics, I submitted a site for them to look at. Unfortunately, it wasn't picked up in the session, but it was still interesting to see some of the problems other sites are having. Most were quite simple, such as having the same title on most pages or using over-complex URLs.
The general approach taken for all sites reviewed was to use the site: command in Google to see how the site was being indexed. This easily showed how many pages were indexed, how many were in the supplemental results, and whether there were potential URL problems or duplicate titles and descriptions. Bruce also used tools on his site to identify on-page issues such as improper use of header tags (H1/H2), multiple URLs for the home page (e.g., mysite.com, mysite.com/Default.aspx, mysite.com/default.aspx), redirection issues, and so on.
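The multiple-home-page-URL problem is easy to test for yourself. Here is a rough sketch of the sort of check involved (my own illustration, not one of Bruce's tools, and mysite.com stands in for your own host): request each variant without following redirects and confirm it answers with a 301 pointing at the one canonical address.

import http.client

# Home page variants that should all 301 to a single canonical URL.
# Substitute your own host and paths; these are placeholders.
HOST = "mysite.com"
VARIANTS = ["/Default.aspx", "/default.aspx", "/index.html"]

conn = http.client.HTTPConnection(HOST)
for path in VARIANTS:
    conn.request("GET", path)
    response = conn.getresponse()
    # http.client does not follow redirects, so we see the raw status code.
    print(path, "->", response.status, response.getheader("Location"))
    response.read()  # drain the body so the connection can be reused

A 200 here instead of a 301 means the engine can reach the same home page through more than one URL, which is exactly the duplicate-content situation the panel was flagging.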
A good tip from the panel that I hadn't thought of before: immediately after reorganising your URLs to make them SE-friendly, submit a sitemap of all the old URLs. The search engine can then find the 301s from the previous URLs and update its index to the new URLs without having to discover them on its own.
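A quick sketch of what generating that "old URLs" sitemap might look like (my own illustration, with placeholder URLs; any sitemap tool you already use would do the same job):

# Old URLs that now 301 to their search-friendly replacements.
OLD_URLS = [
    "http://www.mysite.com/products.php?id=42",
    "http://www.mysite.com/products.php?id=43&cat=7",
]

entries = "\n".join(
    "  <url><loc>%s</loc></url>" % url.replace("&", "&amp;") for url in OLD_URLS
)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    + entries + "\n</urlset>\n"
)

with open("old-urls-sitemap.xml", "w") as handle:
    handle.write(sitemap)

Once the engines have picked up the redirects, you would presumably switch over to a sitemap of the new URLs.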
Well that’s about it from me. Here’s looking forward to SMX London 2008. Hope to see some of you mozzers there!