How Should SEOs Approach Twitter’s New Homepage, Real-Time Search Engine?
July 30, 2009
Twitter’s new homepage seems to be trying to position it as a search engine for “real-time information”.
It looks like you can rank as long as your post is very recent. A search on “keyword xyz” (removed for privacy), which is our #1 keyword for Clientwebsite.com, shows roughly 5 to 10 tweets per day. If we were to post a Tweet 2-3 times per day with “keyword xyz” somewhere in it, we would pretty much always be in the top few results. I know there are Tweet-scheduling applications that could automate this, but that feels fairly spammy to me and not something I’d want to do with a client’s main profile – not really something I think we should do at all. For higher-volume keywords the posting frequency would need to be very high; for lower-volume, lower-competition keywords you could maybe post once a day, or even just a few times per week, and stay in the top few results. This will probably change quite a bit as Twitter grows.
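To make the arithmetic behind that estimate explicit, here is a back-of-the-envelope sketch. The numbers and the helper function are mine, not anything Twitter publishes; it just assumes results are ordered purely by recency and competing tweets arrive at a steady rate.

```python
def posts_per_day_needed(competing_tweets_per_day: float, top_n: int = 3) -> float:
    """Rough estimate of how often you'd need to tweet to usually sit in the
    top N results, assuming ranking is purely reverse-chronological.

    Each new competing tweet pushes you down one slot, so you fall out of
    the top N after top_n competing tweets. To stay in, you'd repost at
    least once per top_n competitors.
    """
    return competing_tweets_per_day / top_n

# With ~7 competing tweets per day (the middle of the 5-10/day range above),
# staying in the top 3 works out to roughly 2-3 posts per day:
print(round(posts_per_day_needed(7, top_n=3), 1))  # → 2.3
```

The same function also shows why high-volume keywords are a different game: at, say, 300 competing tweets per day you'd need ~100 posts per day to hold a top-3 slot, which is pure spam territory.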
The algo doesn’t look very sophisticated – it doesn’t seem to give any weight to how trusted (popular) a user is. In fact, it doesn’t even seem to give weight to keyword proximity. I searched on “network support” and the top 3 posts each contain both words, “network” and “support”, but not the phrase. They are poor results – not actually about network support. The fourth result has the exact phrase and is a good result. The only determining factor in ranking here seems to be how recently the Tweet was posted: as long as you have the words included, you are in the game.
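My reading of that behavior can be sketched in a few lines. This is an illustration of the observed ranking, not Twitter’s actual code: a tweet matches if it contains every query word anywhere in its text (simple substring check), and matches are ordered purely by recency – exact-phrase matches and author popularity get no boost.

```python
from datetime import datetime, timedelta

def matches(query: str, tweet_text: str) -> bool:
    """True if every query word appears somewhere in the tweet (substring
    match, no phrase or proximity requirement)."""
    text = tweet_text.lower()
    return all(word in text for word in query.lower().split())

def rank(query, tweets):
    """tweets: list of (posted_at, text). Matching tweets come back newest
    first; recency is the only ranking signal."""
    hits = [t for t in tweets if matches(query, t[1])]
    return sorted(hits, key=lambda t: t[0], reverse=True)

now = datetime(2009, 7, 30, 12, 0)
tweets = [
    (now - timedelta(minutes=3), "my home network needs support, any ideas?"),
    (now - timedelta(minutes=7), "we offer network support for small businesses"),
]

# The fresher tweet wins even though only the older one contains the exact
# phrase "network support":
for posted, text in rank("network support", tweets):
    print(posted.strftime("%H:%M"), text)
```

Under this model, the “network support” example above falls out naturally: a poor-but-recent result outranks a relevant older one every time.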
Additional verification: I just posted a question about an HDTV, and I’m now ranked #3 for that search on Twitter’s homepage. The two tweets above mine were posted more recently (I posted 7 minutes ago; they posted 3 and 4 minutes ago).
I can’t help but feel that all this is going to do is encourage a high volume of spammy Tweets from people who want to stay at the top of Twitter search results and know that frequency is really the only requirement. Hopefully Twitter adds in some signals to account for spam versus trusted users.
It’s clear the spammers are already hitting this hard. How should higher-end SEOs and new-media consultants approach this?