I’ve decided to shut down my experimental SEO keyword graphing tool to focus on other activities. Sorry if you came here looking for it, but it’s gone. I wish I could suggest a good alternative, but as far as I know there still isn’t a tool that can take a keyword, expand it, and produce a graph showing all the related terms and how you should plan your web site pages around them.
If anyone wants to buy the domain seokeywordsearch.com let me know before I let it go as it will probably get snapped up by some domain squatter.
Anyone buying a domain who is concerned about competitors or domain name squatters sneaking in should consider:
1. The homophones (i.e. same pronunciation) for the prefix (if any), root and suffix (if any) of the domain name (e.g. fone and phone)
2. The plural form or singular form for nouns (e.g. -tune and -tunes)
3. Other tenses of verbs (e.g. -stack and -stacking (gerund) or -stacked (past participle))
4. The hyphenated form for compound words
I’ve paid 3x for a plural version of a domain that someone grabbed after I’d just bought the singular form. Unfortunately consumers tended towards the plural form when recalling the name which made the purchase necessary. That’s another strategy domain squatters may use against you when you purchase a domain name – they see the registration change, know that the name is worth something to someone and so they immediately grab the plural form.
A good domain name finding tool would include phonetic and morphological tools to help you find domains where all relevant forms are available (startup idea for someone).
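As a rough illustration of the morphological side of such a tool, here is a minimal sketch of a variant generator. The suffix rules and the word list are simplifying assumptions for illustration; a real tool would add phonetic matching (e.g. Soundex or Metaphone) and a proper morphological dictionary.

```python
# Sketch of a domain-variant generator covering the checklist above.
# The suffix rules and word list are deliberately naive assumptions;
# a real tool would use phonetic and morphological dictionaries.

def domain_variants(name, tld=".com"):
    """Return plural/singular, verb-tense and hyphenated variants of a name."""
    base = name.lower().replace("-", "")
    variants = set()
    # Plural / singular form of the final noun
    if base.endswith("s"):
        variants.add(base[:-1])
    else:
        variants.add(base + "s")
    # Naive verb tenses for a root like "stack"
    variants.add(base + "ing")
    variants.add(base + "ed")
    # Hyphenated form, using a hypothetical word list for illustration
    words = ["seo", "keyword", "search"]
    for w in words:
        if base.startswith(w) and len(base) > len(w):
            variants.add(base[:len(w)] + "-" + base[len(w):])
    variants.discard(base)
    return sorted(v + tld for v in variants)

print(domain_variants("seokeywordsearch"))
```

A check like this, run against a WHOIS or availability API, would flag which of the neighbouring forms are still free before you commit to a name.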
In a previous post I provided a utility called LinkChecker, a web site crawler and link checker. The idea behind LinkChecker is that you can include it in your continuous integration scripts and check your web site either regularly or after every deployment; unlike a simple ping check, it will fail if you’ve broken any links within your site or have SEO issues. It also fails just once for each site change and then passes again the next time you run it. This means that in a continuous integration system like TeamCity you can get an email or other alert each time your site (or perhaps your competitor’s site) changes.
As promised in that post, a new version is now available. There are many improvements under the covers, but one obvious new feature is the ability to dump all the text content of a site into a text file. Simply append -dump filename.txt to the command line and you’ll get a complete text dump of any site. The dump includes page titles and all visible text on the page (it automatically excludes embedded script and CSS). It also excludes any element with an ID or CLASS that includes one of the words “footer”, “header”, “sidebar” or “feedback”, so you don’t get lots of duplicate header and footer information in the dump. I plan to make this more extensible in future so that other words can be added to the ignore list.
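The exclusion rule works roughly like the following sketch. This is a hypothetical reimplementation in Python for illustration only, not LinkCheck’s actual code.

```python
# Sketch of the dump's element-exclusion rule: skip any element whose
# id or class attribute contains one of the ignore words. Hypothetical
# illustration, not the tool's actual implementation.

IGNORE_WORDS = ("footer", "header", "sidebar", "feedback")

def should_exclude(element_id="", element_class=""):
    """True if the element's id or class contains an ignore word."""
    haystack = (element_id + " " + element_class).lower()
    return any(word in haystack for word in IGNORE_WORDS)

print(should_exclude(element_id="pageFooter"))       # True  -> excluded
print(should_exclude(element_class="main-content"))  # False -> kept
```

Note that the match is a substring check, so an id like `pageFooter` is caught even though it isn’t exactly the word “footer”.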
One technique you can use with this new ‘dump’ option is to dump a copy of your site after each deployment and then check it into source control. Now if there’s ever any need to go back and see when a particular word or paragraph was changed on your site, you have a complete record. You could, for example, use this to maintain a text copy of your WordPress blog, or perhaps to keep an eye on someone else’s blog or Facebook page to see when they added or removed a particular story.
Download the new version here:- LinkCheck <-- Requires Windows XP or later with .NET4 installed, unzip and run
Please consult the original article for more information.
LinkCheck is free, it doesn’t make any call backs, doesn’t use any personal data, use at your own risk. If you like it please make a link to this blog from your own blog or post a link to Twitter, thanks!
First there was Continuous Integration, then there was Continuous Deployment, now there’s Continuous Testing.
Testing can (and should) be integrated throughout your web site development process: automated unit testing on developers’ machines, automated unit testing during the continuous integration builds, and further automated testing after your continuous deployment process has deployed the site to a server.
Sadly, once deployed, most sites get only a cursory test through a service like Montastic that pings one or more URLs on your site to check that the site is still alive.
But how do you know if your site is still working from a user’s perspective or from an SEO perspective? Serious bugs can creep in from seemingly small changes that aren’t in code but in the markup of a site, and these are often not caught by any of the aforementioned tests. For example, a designer editing HTML markup could accidentally break the sign-up link on the main entry page, or the page you had carefully crafted to be SEO-optimized around a specific set of keywords could accidentally lose one of those words and thus lose rank in search engines, sending your traffic down. Would you even know if this had happened?
Based on a small test I ran on some local startup web sites, the answer appears to be ‘no’. These sites often had broken links and poorly crafted titles (from an SEO perspective). Of course they could have used any of the many SEO services that can check your site to see if it has broken links or poorly crafted titles and descriptions (e.g. seomoz.com), but that’s often a manual process and there’s no way to link such tests into your existing continuous integration process.
What would be nice would be if you could include a ‘Continuous Link and SEO test’ on your Continuous Integration Server. This test could be triggered after each deployment and it could also run as a scheduled task, say every hour, to check that your web site is up and that all public pages are behaving correctly from a links and SEO perspective. It would also be nice if there was some way to get a quick report after each deployment confirming what actually changed on the site: pages added, pages removed, links added, links removed.
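At its core, that change report boils down to set differences between two crawls. Here’s a minimal sketch of the idea (a hypothetical illustration, not LinkCheck2’s actual implementation):

```python
# Sketch of a deployment change report: compare the set of pages (or
# links) seen in the previous crawl with the current one. Hypothetical
# illustration of the technique, not LinkCheck2's actual code.

def change_report(previous, current):
    """Return what was added and removed between two crawls."""
    previous, current = set(previous), set(current)
    return {
        "added": sorted(current - previous),
        "removed": sorted(previous - current),
    }

before = {"/", "/about", "/signup"}
after = {"/", "/about", "/pricing"}
print(change_report(before, after))
# {'added': ['/pricing'], 'removed': ['/signup']}
```

The same diff applied to link sets instead of page sets gives you links added and links removed.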
This is what my latest utility ‘LinkCheck2’ does. It’s a Windows command line application that produces a report, and it will set an error code if it finds anything amiss. You can run it from the command line for a one-off report or call it from your continuous integration server. The error code can be used by most CI servers to send you an alert. If you are using the changes feature you’ll get an alert when something changes, and on the next run it will automatically clear.
LinkCheck2 also includes the ability to define a ‘link contract’ on your site. This is a meta tag you add to a page to say ‘this page must link to these other pages’. LinkCheck2 will verify that this contract has been met and that none of your critical site links have been dropped by accident when someone was editing the markup.
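A link contract check could look something like the sketch below. The meta tag name `link-contract` and its comma-separated format are assumptions invented for this illustration, not necessarily LinkCheck2’s actual syntax.

```python
# Sketch of a 'link contract' check: a page declares the links it must
# contain in a meta tag, and we verify each required href is present.
# The tag name "link-contract" and the comma-separated content format
# are assumptions for illustration, not LinkCheck2's actual syntax.
import re

def check_link_contract(html):
    """Return the list of required links missing from the page."""
    m = re.search(r'<meta\s+name="link-contract"\s+content="([^"]*)"', html)
    if not m:
        return []  # no contract declared on this page
    required = [u.strip() for u in m.group(1).split(",") if u.strip()]
    hrefs = set(re.findall(r'href="([^"]*)"', html))
    return [u for u in required if u not in hrefs]

page = '''<meta name="link-contract" content="/signup, /pricing">
<a href="/signup">Sign up</a>'''
print(check_link_contract(page))  # ['/pricing'] -- contract violated
```

A CI step would fail the build whenever this list is non-empty, catching the “designer accidentally deleted the sign-up link” class of bug.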
At the moment LinkCheck2 checks all links and performs a small number of SEO tests (mostly around the length of titles). If there is interest in this tool I may expand the SEO capabilities, please send me your feedback and requests.
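A title-length test of the sort mentioned might look like this sketch. The 10 and 65 character bounds here are illustrative assumptions (search engines typically truncate displayed titles somewhere around 60–70 characters), not LinkCheck2’s actual thresholds.

```python
# Sketch of a simple SEO title-length test: flag titles that are empty,
# too short to be descriptive, or long enough to risk truncation in
# search results. The bounds are illustrative assumptions, not
# LinkCheck2's actual rules.

def check_title(title, min_len=10, max_len=65):
    """Return 'ok' or a description of the title problem."""
    title = title.strip()
    if not title:
        return "missing title"
    if len(title) < min_len:
        return "title too short"
    if len(title) > max_len:
        return "title may be truncated in search results"
    return "ok"

print(check_title("Home"))  # title too short
print(check_title("LinkCheck2: continuous link and SEO testing for your site"))  # ok
```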
Use of LinkChecker.exe is subject to a license agreement: in a nutshell: commercial use is permitted, redistribution is not. Please contact me for details.
One thing that struck me while I was building the Seo Keyword Search and Mapping tool was that keyword analysis really can reveal a lot about what people are thinking about and what they are looking for. SEO keyword analysis is the largest-scale unprompted recall survey you can possibly do. You can apply it to almost any industry, and it gives you a detailed picture of what customers of that industry really want.
Of course, prior to the launch of the Seo Keyword Search and Mapping tool it was pretty hard to actually see that picture but now it’s quite easy.
So what can you do with this newfound understanding of your customers?
Well, obviously you can create AdWords campaigns around it – that’s why Google provides the information in the first place. And clearly you can craft better landing pages with more keyword-friendly titles, headings, body content and images. But why stop there? Here are four more things you can do with the information:
1. Plan your next blog post using it – make sure you’ve covered all the topics people are asking about for your industry
2. Create a digital sign using the keywords that are most interesting to people who visit your retail locations
3. Use them in Twitter campaigns
4. Use them to craft better direct marketing messages whether email or postal
There’s a post over on Search Engine Land that has a good list of SEO Myths and things to avoid: http://searchengineland.com/36-seo-myths-that-wont-die-but-need-to-40076
There are however two points with which I take issue:
“14. It’s important for your rankings that you update your home page frequently (e.g. daily.) This is another fallacy spread by the same aforementioned fellow panelist. Plenty of stale home pages rank just fine, thank you very much.”
Yes, there are stale pages that rank highly, but that doesn’t mean updating your site regularly won’t boost your search engine rankings; the stale pages may be at the top of their category for other reasons. If you aren’t #1 on the keywords you care about, then I highly recommend adding fresh, relevant content to your site on a regular basis. I’ve observed competing sites leap up by anywhere from 2 to 10 places in the rankings when they updated their site and then gradually drop back over the following days or weeks. Google does care about recency, so this isn’t such good advice as written.
“36. Great Content = Great Rankings. Just like great policies equals successful politicians, right?”
I’m not sure what the author means by this point. Content and in-bound links form the backbone of SEO efforts. So don’t stop adding great content to your site!
In a previous post I explained some of the issues with ASP.NET MVC when trying to implement an SEO-optimized web site. In this post I’ll begin to explore some possible solutions.
Step 1: Master View – some additions
First, let’s make it easy to set the meta keywords and canonical URL by adding the following to the head section of the master view:
<head id="Head1" runat="server">
    <title><%=ViewData["PageTitle"]%></title> <%-- This gets wrapped here, so it sees a title tag and doesn't emit two --%>
    <%=ViewData["PageDescription"]%> <%-- These are wrapped elsewhere so they vanish if not set --%>
    <%=ViewData["PageKeywords"]%> <%-- These are wrapped elsewhere so they vanish if not set --%>
    <%=ViewData["CanonicalUrl"]%> <%-- These are wrapped elsewhere so they vanish if not set --%>
    <meta name="robots" content="noodp" /> <%-- Don't use Open Directory Project descriptions --%>
Note that we are not writing the canonical link and meta tags inline around the ViewData values here (even though that might seem more correct). Instead, the complete tag is built elsewhere and stored in ViewData, so when a value is not set the entire tag disappears from the page rather than rendering a tag with an empty string in it. The title tag, however, is universal, so for that one we do it ‘properly’.
In tomorrow’s post I’ll show how we can set the Canonical URL using an attribute.
Being an engineer at heart, I like to measure things and take them apart to see how they work. Search engines, however, are somewhat opaque – they use hundreds of different algorithms to create Search Engine Results Pages (SERPs), and those algorithms are a closely guarded secret and the subject of much debate and investigation in the SEO community.
So, since I wanted to learn more about SEO, I decided to track and graph the top 100 ranked entries for a particular keyword over several weeks, and here is the result (click to enlarge). A few observations:
(i) volatility is clearly a lot less in the top ranked sites – moving one position from 4 to 3 is going to be a lot harder than moving from 94 to 93.
(ii) some sites can have stunning leaps for a short time but then get reset back to their former position – presumably getting the ‘recent content’ lift but then failing to capitalize on it with more frequent updates.
(iii) search engine optimization (SEO) is not something you can do once and forget, it’s something you need to stay on top of with frequent updates to your site and constant in-bound link building efforts.
I plan to compare this chart against some other keywords I’ve been tracking to see how they compare. I also want to track each site to determine what caused the particularly stunning leaps or falls in the rankings and learn from it. Maybe a chart like this could help identify which keywords are more volatile than others and therefore which ones you can make most progress against.
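One simple way to quantify that volatility is the average absolute day-to-day movement of a site’s position. This is a sketch with invented sample data, assuming you’ve recorded one rank position per day:

```python
# Sketch of a keyword-volatility measure: the mean absolute day-to-day
# change in a site's rank position. The sample data is invented for
# illustration.

def volatility(positions):
    """Mean absolute day-to-day change in rank position."""
    moves = [abs(b - a) for a, b in zip(positions, positions[1:])]
    return sum(moves) / len(moves)

top_site = [4, 4, 3, 4, 4]         # stable near the top
lower_site = [94, 88, 93, 81, 90]  # churning near the bottom
print(volatility(top_site))    # 0.5
print(volatility(lower_site))  # 8.0
```

Averaging this score over all 100 tracked sites for a keyword would give a single volatility number per keyword, making it easy to spot the keywords where progress should come fastest.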
It’s early days, I just generated the first chart a few hours ago, so stay tuned for updates on this project.