A few weeks ago, Derek Halpern over at Social Triggers wrote a great piece on persuading people you don’t know to help you – aka, getting links.
In short, Derek used the incentives brought to light in the popular book Freakonomics – economic, social and moral – then translated them into ways your outreach emails can provide value and increase responses.
Similar to Derek, I’m a fan of the thought process behind the Freakonomics’ incentives and use them for all facets of link building.
One of the most effective applications of these incentives is to link builders themselves.
Google Webmaster Central describes the site title, or title tag, as a quick representation of the content of a result and its relevance to a specific query.
Sounds basic, right? However, one thing you may not know, or may never have dealt with, is that Google reserves the right to alter a title tag if it feels the tag isn’t the best representation of a page.
Over a year ago, Pierre Far – a Google Webmaster Trends Analyst – explained that algorithms will generate multiple alternative titles so that pages aren’t constrained to having the same static title tag for every search query.
The basic thought behind this is to increase click-through rates by displaying a “better,” more concise title tag, or by changing the tag to match semantic terms so users can easily recognize a relevant page. Yes, anyone who has ever written a PPC ad can tell you that a relevant, concise ad leads to an improved CTR, but what happens when the rewritten title isn’t a “better” choice for the user, or when it makes a site look incompetent?
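One practical defense is auditing your own titles before Google rewrites them, since overly long titles are prime candidates for truncation or replacement. Here is a minimal sketch in Python; the 60-character cutoff is my own rough heuristic, not an official limit (Google actually truncates by pixel width, not character count):

```python
# Flag title tags likely to be truncated or rewritten in search results.
# MAX_TITLE_CHARS is a rough heuristic, not an official Google limit.

MAX_TITLE_CHARS = 60

def audit_titles(titles):
    """Return (title, length) pairs for titles exceeding the heuristic limit."""
    return [(t, len(t)) for t in titles if len(t) > MAX_TITLE_CHARS]

titles = [
    "Link Building Tips",
    "The Complete, Exhaustive, Absolutely Definitive Guide to Every "
    "Single Link Building Tactic Known to SEO Professionals",
]

for title, length in audit_titles(titles):
    print(f"{length} chars, likely truncated: {title[:50]}...")
```

Running a check like this across a crawl of your site surfaces the pages where Google is most likely to substitute its own title.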
It’s not breaking news that search engines filter out duplicate or thin results. Yes, the first Panda algorithm caught many sites off guard – Google itself said nearly 12 percent of queries were affected; however, low-quality content isn’t what caught these sites by surprise.
What surprised webmasters was the fact that a gap started to close between what Google said happens and what actually happens.
This initial shock should have been expected. I mean, think about Google’s purpose – to provide relevant content that people will want to read and share. If its search engine is promoting duplicate or thin results, what are the odds that a user switches to a competitor? Very high.
Last July, Google added a new feature to Webmaster Tools – Index Status. This feature helped webmasters better understand their site’s indexing, including what I considered its most important part: Not Selected, a count of the pages the algorithm overlooked.
As of yesterday, it seems that this feature has been turned off. Not Selected was popular amongst technical SEOs because it let you pinpoint the exact moment a duplicate content or indexing issue occurred.
At this point, there isn’t a test, certification or degree/major for SEO professionals, though you may find that happening in the near future – especially with amazing programs like Distilled U. But no matter how you fell into SEO, one thing is always true: there is a massive amount of information on the Internet pertaining to SEO, and not all of it is correct.
To alleviate this, I’ve put together the list below, containing a handy glossary, industry blogs, great articles (which will be updated frequently) and, most importantly, people you should be following on Twitter to keep up with the changing SEO landscape.
I’ve found that following great minds like those below is one of the best ways to sharpen your skill set, and you can learn quite a lot from these great men and women.
Writing exceptional web copy that gets recognition from users as well as search engines is a skill that you must learn, just like any other SEO competency.
The goal of web copy is generally to provide information that prompts users to take a specific action – making copy as important to your conversions as any other part of your process.
To streamline your next project, here is a process, along with some tips, that will have you writing effective copy in no time.
On Tuesday, Facebook announced the third pillar of its social network – search.
This internally-focused social search engine, known as Graph Search, uses people’s likes, connections, location, experiences and pretty much anything shared on Facebook to provide what looks to be an effective local search, or product/service recommendation engine.
Evaluating link opportunities can be one of the largest gray areas in SEO. Sometimes we are 100 percent confident in our target – like Larry Kim of WordStream after nailing a coveted link from the Wall Street Journal. However, not every link is beneficial to a site’s profile, and you want to be certain that you are not wasting your time or doing something that could potentially harm the quality of your site.
When searching for links, the best advice I’ve heard in a long time came from Julie Joyce of LinkFish Media: the best link builders are the ones who can do their job without relying on a toolbar. That really hit home with me. More often than not, I’d hear from my team that they had passed on an opportunity because Open Site Explorer gave a site a Domain Authority of 42 and a Page Authority of 37 – yet, thinking back, I can remember specific instances where posts I had on older sites gained massive traction as those sites aged and grew.
So the next time you consider dropping a potential link, think of the long term, and use your eyes and brain before relying on a tool. Here are some of the specific things I look for before reaching into my tool box.
Yesterday at Pubcon Vegas, a search, social media and affiliate marketing conference, Matt Cutts, the head of Google’s webspam team, announced a new tool for webmasters – the disavow links tool.
This tool, which gives webmasters the ability to disassociate themselves from link spam, has built up much anticipation since Bing released its own disavow links tool nearly three months ago, and could be seen as a savior to the many sites directly afflicted by Penguin. And while Bing’s tool may look more user friendly, Google’s format is much easier once you understand the process.
*Note: if this tool confuses you, DON’T use it. Hire someone who knows what they are doing.
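For reference, the file Google expects is plain text (UTF-8), with one entry per line: a full URL to disavow a single page, a `domain:` prefix to disavow an entire domain, and `#` for comments documenting your cleanup efforts. A minimal sketch, using placeholder domains:

```
# Contacted site owner on 10/1/2012 to request removal – no response
domain:spamsite.example

# Paid link we could not get taken down
http://www.example.com/paid-links/page1.html
```

Keeping the comments current is worth the effort; they double as a log of your outreach if you ever need to file a reconsideration request.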
Finding linking opportunities through backlink analysis requires some common sense, basic knowledge of your competitors, an idea of what good links are, and access to a quality backlink analysis tool like SEOmoz’s Open Site Explorer.
I’m a huge fan of Open Site Explorer, but I’ve also come to like Ahrefs Site Explorer for the in-depth reports you can run, such as analyzing anchor text by terms, as shown by Ross Hudgens. For this example, however, I’ll be focusing mostly on Open Site Explorer, since it gives me quick access to the metrics I like to see on the fly.