Mozcon 2012 Recap – High ROI Content Strategies For SEO

Cyrus Shepard – High ROI Content Strategies – @cyrusshepard

Good morning from Day Two of Mozcon 2012. After a killer API/Tools presentation by Richard Baxter, Cyrus Shepard impressed the crowd with a great presentation on High ROI Content Strategies. Here are some notes and highlights from his presentation:

Past, Present & Future

Cyrus started off by talking about the past and the future. The SEO community has seen lots of changes from Google and Bing targeting SEO. We’ve seen some techniques vanish, specifically around low-quality link building such as link farms, general directory submissions, article marketing, and others.

Google is changing even non-spammy best practices like title tags; they regularly rewrite your title tag in the SERPs. Now they are aggressively targeting popular link building techniques, like infographics, that are being abused. Even anchor text is changing: SEOs are building links with anchor text like "click here."

Stay Ahead of The Curve

One of the underlying themes to Cyrus’ speech was that we need to stop chasing what Google is changing. We need to be where Google/Bing are today, not using practices that they are trying to kill.


Some marketers feel like Google+ is a third sock. Cyrus realized that it's not a social network; it's a knowledge network. And Google+ has given us things other social networks don't:

  • Followed profile links in your profile introduction
  • Profiles have PageRank and are indexed
  • More circles = more links to your profile page
  • Links in posts are followed and help get indexed quickly

Tip: Utilize Google+ profiles and posts to support your link building efforts and speed up indexation of new content.

Google+ is real content that’s indexed on the web like a blog, not just a social network.

Google+ is the only major platform with author photos. Cyrus tested different photos of himself and tracked changes in CTR in Google Webmaster Tools.

The results? He ended up increasing CTR on one of his sites by an average of 35% over time. There isn’t a magic picture or color that applies to all sites. It changes depending on your audience and demographic.

Tip: Test and optimize your profile image – how many links would it take to increase traffic by 35%?

Tool: – gives you stats on all your Google+ posts.


Cyrus has relied heavily on infographics but says they are becoming overplayed. Too many people are building poor infographics and manipulating link profiles by forcing embeds with keyword anchor text.


  • Never use widgets to force anchor text. Let links be natural and editorial instead.
  • Don’t link to the homepage. If you ever need to cut a page due to a Penguin problem, it’s easy to 404 a subpage but not your homepage.

eBooks Beat Infographics

1. Easier to produce – like a really great blog post

2. Less expensive – $250-$500 to produce one vs $500-$1000 for an infographic

3. Evergreen content – infographics peak. Ebooks provide value for years if quality is good

4. No saturation – Link sources generally are higher quality

5. eBook Directories build additional links

6. Can be repurposed into blog posts, webinars, etc.

7. Not on Google’s (Matt Cutts) radar yet

Press Releases

Cyrus estimates that 99% of press releases earn no editorial links, and you could actually be hurting your chances at ranking by putting your exact match anchor text on a flood of duplicate-content, low-quality sites.

There are still times to use press releases, though. Use them the way they were intended – to announce important things and linkable assets, not to build links:

  • new, useful and newsworthy
  • tools
  • guides, reports, and ebooks
  • free giveaways
  • videos
  • data visualizations

For some services you should hire a professional through marketplaces like Contently, oDesk, Guru, Elance, etc. Find someone with a history of getting content placed on REAL sites.

HARO – 3 steps to success:

  1. Use Gmail canned response for efficiency
  2. Link to expert author page to show qualifications
  3. Link to your top articles to establish yourself as an expert – reporters may also link to these articles

Anchor Text

Google doesn’t need anchor text as much due to better signals from the linking page:

  1. keyword proximity
  2. reading level
  3. title tag
  4. page relevance
  5. domain relevance
  6. inbound links
  7. author authority
  8. contextual analysis
  9. block analysis
  10. term weighting


It was a really great speech from Cyrus with some actionable tips and awesome insights and theory into the way things used to be, the way they are now and the direction we’re going in the future.

Stay tuned for more 2012 Mozcon wrap-ups today and tomorrow!

Mozcon Day 2 Recap – Online Reputation Management

Rhea Drysdale (@rhea) of Outspoken Media began her presentation on ORM by announcing that it makes no sense that she does online reputation management, because she tends to be crude and outspoken! I enjoyed listening to what she had to say and her delivery, and I have a few takeaways to share.

She asked the audience two questions to get everyone thinking:

  1. Your CEO just got caught embezzling funds. What are you going to do and are you prepared?
  2. You represent a really large consumer products company and your product just hurt a lot of kids, what are you going to do? What about your search results? How long do you think it will take to recover?

It takes six months to one year to recover your search standing and three years for your brand to start to recover. You never know if or when you will be hit with the unimaginable. Companies need a long term strategy that will allow them to effect true organizational change.

Brands don’t have insurance policies, but there are things that can be regulated and implemented to ensure that you are covered in the event of a reputation scandal.

  • Social media policies
  • Community management
  • Use listening tools
  • Brand development
  • Customer service
  • Affiliate management (have policies in place for what your affiliates can and can’t say)

In the next 10 years there will be a 21% increase in PR jobs. This is in direct response to businesses and government agencies needing to respond quickly to news and information that moves rapidly on the Internet and in social media. Crisis management/communication is not new; we've always had disasters happen. ORM works to fix the problem rather than treating the symptoms.

Organizational change requires trust, conflict, commitment, accountability, and results. This is where most companies fail, as most ORM issues are due to a problem in one of these areas. And, the most common reason for an ORM problem is poor leadership. You can prove that there’s a business case for implementing online reputation management:

  • Identify your organizational allies & threats (SWOT analysis)
  • Do a needs assessment (look at Autocomplete results in Google, just don’t hit enter!)
  • Look at search results
  • Monitor social mentions
  • Gather customer feedback
  • Understand customer loyalty (know your Net Promoter Score)
  • Look at conversions
  • Do competitive analysis (look at branded queries, purchase related queries with Google search suggest)
  • Audit corporate communication
  • Audit customer communication

ORM is not magic, it’s just SEO. Are you prepared? We can’t insure or replace our brands. We can protect them. Be brave and care, start implementing!

Growth of an SEO: Using Social Media & Networking to Grow your Knowledge & Reputation with Dan Shure (@dan_shure) on #seochat

On Thursday, May 3, 2012, Search Marketing Weekly will be hosting an #seochat on Twitter. Dan Shure will be our guest, answering questions about SEO and social media.

Starts at 7:00 pm Mountain Time
About 1 hour long
Use hashtag #seochat

Host: Becky Jutzi

Dan Shure

Dan Shure of Evolving SEO

Dan Shure (@dan_shure) is Owner of Evolving SEO in Central Massachusetts. His obsessive love for SEO began after building and optimizing websites for some small businesses in 2007, and it took off from there. Evolving SEO has been officially operating since 2011, and Dan is also an SEOmoz Associate.

Should I Consider Redesigning My Website?

Bad web design isn’t just about GIF backgrounds and badly chosen color schemes; it goes down to the bones of your website and how your content will affect search rankings. It might seem like a no-brainer that your site design and content should be geared toward the user, but that principle too often goes out the window in favor of just creating a page that provides basic functions for the business.

Reasons To Redesign Your Website - Infographic


I’m always interested to see data on how people behave on the Web. This infographic zeroes in on some very interesting behaviors when it comes to search. Most people don’t go beyond the first page of search results before they click on a link. And if they don’t find what they want in the first search, 41% will refine their search query or switch search engines rather than click through to the second or third page of results. That’s interesting because it indicates that people are making decisions about whether websites have the info they want before they even get to the site. If your page titles and meta-descriptions aren’t reflecting the actual content on your site, people are less likely to click through, even if the content they want might actually be there. Even an artfully designed site will suffer from lack of attention to the details.

What’s even more interesting in this graphic is the fact that over three-fourths of users ignore the paid search links and focus on the organic listings. People know that the paid links aren’t getting them where they want to go, so they’re taking pains to avoid them and go straight to the organic results. Just that behavior alone makes it startlingly clear that the content and services of your site are the foundation for any other optimization that’s taking place.

It’s no wonder that a good portion of the infographic talks about the frequency with which experts recommend redesigning a site. Google changes its algorithm, trends in each industry shift, economic forces push users in different directions — the rest of the world is evolving. Why shouldn’t your site?

About is an internet marketing firm that specializes in Search Engine Optimization, helping clients achieve top search engine placement and grow their revenue through holistic SEO strategies. For more information, visit their site or check out their search engine marketing strategies blog today!

How To Identify True Search Competitors – SEO Competitive Analysis

At SMX Advanced back in June I attended a session where I heard about a really cool idea. The concept was that you could take a ton of keywords, and then map out what sites were showing up on those keywords to really get a good idea of who your real search competitors are.

The bad thing was that they used their own internally developed software. I'm always bothered when I get excited about an idea but have no way of reproducing it. Well, I did some digging around and found a way to build a similar type of report. Sure, it's more labor intensive, but you can really learn something about who else is competing for your same keyword sets.

Below I’m going to outline the techniques I use to get the data, how to organize it, and what to do with it once you have the information. I admit that it may be a little choppy, so if you have any questions on it feel free to hit me up on Twitter: @dan_patterson

This technique was also mentioned by my friend Matt Siltala in his Pubcon presentation last week, which you can view here: Competitive Intelligence Pubcon Las Vegas.

Step 1 – Get a List of Keyword Sets

Since the goal of this whole exercise is to find other sites that are going after the same keyword sets as you, the first step in this process is to come up with a large list of keywords. A good place to start is to go through your analytics to find keywords that you're already getting traffic from.

Once you have a list, break them up into topic sets. This way, you will be able to find which sites are full competitors or just partial competitors. Come up with a short name for all of these sets, and use this when you’re scraping results to identify which set that scrape belongs in.

Step 2 – Scraping the SERPs

There are plenty of tools out there that you can use to scrape the search results. Some of them use proxies and other tactics that the search engines aren’t fond of, so instead I’m going to go over two tools you can use that shouldn’t raise any of these problems.

The first is a handy little Firefox plugin called OutWit Hub. The second I’ll go over is the SEOmoz Pro Keyword Difficulty & SERP Analysis Tool for those of you that are already members of SEOmoz Pro.

Scraping SERPs with OutWit Hub

1- Download OutWit Hub

Go to and download OutWit Hub (Free Firefox Plugin). Technically it’s a site content scraper tool, so we’re going to use it to scrape URLs from the SERPs.

OutWit Hub

2- Change Your Google Search Settings

In order to effectively use the plugin, you’re going to have to change a few of your Google search settings

  1. Turn off Google Instant by going to your account Search Settings, and then choosing “Do not use Google Instant”.
    Turn Off Google Instant
    Google Instant and Number of Results
  2. Decide how deep you want to look and set “Number of Results” to match. You can choose 10 (default), 20, 30, 50, or 100.
  3. Save your preferences and go back to Google Search

3- Scrape

Do a search for your first term. Once the SERPs come up, click on the OutWit Button in Firefox.

OutWit Button

This will give you the OutWit Hub Window. In the window, click on the ‘Guess’ option.

Guess Button

This will give you the info from the SERPs. We’re most interested in the “id” (rank) and “URL” columns. You can either export this info or just copy and paste it into Excel.

Ann Smarty did a post about OutWit a while back and also has a custom scraper you can use for Google results. The only problem I’ve found with this is that sometimes you’ll get URLs with spaces in them from breadcrumbs, which makes it a little harder to filter things down in Step 3. If you are in a niche that doesn’t have this problem, this can be a faster way to go.

4- Download Your Scrape and Clean It Up

One problem with OutWit Hub is that it can be inconsistent. Sometimes you get local listings in the export, sometimes you don't. Sometimes you get paid listings in there, sometimes you don't. So you have to watch what you're scraping and make sure you're actually getting the right info. There's usually a heading row, but you still have to do some filtering and cleanup work to get an accurate list. When you do this, make sure you also update the id (rank) column to reflect the real rankings you're seeing.

You can either export the data to a CSV, or you can also just copy and paste it into Excel. I like the copy and paste option because if I see some paid ads at the top or bottom of the data, I can just not copy those rows.

5- Rinse and Repeat

This is unfortunately the labor intensive part of this whole process. You’ll have to repeat this process for all of the keywords you want to check. Again, there are other tools that do a little bit more brute force against Google, but OutWit Hub is a great FREE tool that will help you get the data you need if you’re willing to take the time.

No matter which method you use, make sure that you add a column at the beginning that includes your shortname for each set before the rank and URL of each scrape. This way you can identify which set the rankings and URLs belong to later.

Also, make sure you’re combining all of your data into one spreadsheet so we can do the comparison and filtering later. In Step 3 of this whole process I’ll show you what to do once you have all of your scraping done.

SEOmoz Keyword Difficulty & SERP Analysis Tool

SEOmoz Keyword Difficulty & SERP Analysis

In the long run, I think the SEOmoz tool is a lot easier and cleaner to use for this exercise. One nice thing about using the Keyword Difficulty & SERP Analysis Tool is that you can run up to 5 keywords at a time, and you don't have the cleanup work that you have with OutWit. One difference between the two is that with OutWit you can dig as deep as you set your Google Search settings; with SEOmoz you will get the top 25 and that's it.

Here are the steps to getting the same data with the SEOmoz Keyword Difficulty & SERP Analysis Tool:

1- Run a Report (up to 5 at a time)

Sometimes I've found that the tool will time out if you run 4 or 5 keywords at once, so if you're having that problem just run 3 and you'll have an easier time.

2- CSV Export

Once the report loads, click on it and then choose the “Export to CSV” link down towards the bottom. It’s above the table with all the pretty greens and reds.

CSV Export - SEOmoz Tool

3- Rinse and Repeat

The only columns we need are ‘Rank’ and ‘URL’. If you want to start getting in to Domain and Page Authority comparisons you could use that data as well, but for this blog post I’m just going to keep it simple.

Just like with the OutWit Hub data, make sure you're combining all of your CSV downloads into one master file so you can do the filtering you'll need to do.

Step 3 – Filter Down to Just Domain Names

In order to really do the comparison, you need to filter your SERP scraping down to just the domain names. With a little Excel formula magic, this is easily done. Since there are so many different versions of Excel and other spreadsheet programs, I'm just going to give you the basic steps and formulas here so you can do what you need to in the program/version you're using.

Before you do the steps below, make sure to MAKE A COPY OF ALL YOUR SCRAPED DATA. We’re going to filter down to just the domain names you’ve scraped, but that’s only so we have a list of unique domains that we can then do some counting and averages on. You have to leave your original data so you can get the counts. So I repeat, make a copy of all your scraped data and do the steps below on the copy.

  1. Use ‘Text to Columns’ and delimit on ‘/’. This is probably the easiest way to break out the http: and any other folders in the URLs you’ve scraped. Delete all of the columns that don’t have just the domain name.
  2. Get rid of www. Since some of your scraped URLs will have www and some won't, we need to get rid of it. Sort your list of domain names alphabetically. Then do another Text to Columns on the domains that have the www in them; the 'Fixed Width' option is easiest, since www. is always the same width. You may also have other subdomains in your list, but honestly I would just treat these as separate sites from the main domain.
  3. De-dupe. Now that you have your list of just domain names without the www and folders, de-dupe this list so that you have a list of unique domain names.
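If you'd rather script this cleanup than click through Excel, the same normalization can be sketched in Python. This is just a sketch with made-up placeholder URLs, not part of the original workflow:

```python
# Sketch of Step 3 in Python: strip scheme, path, and leading "www."
# from each scraped URL, then de-dupe to a list of unique domains.
from urllib.parse import urlparse

def to_domain(url):
    """Reduce a scraped URL to its bare domain name."""
    host = urlparse(url).netloc or urlparse("//" + url).netloc
    return host[4:] if host.startswith("www.") else host

scraped = [
    "http://www.example.com/widgets/blue",
    "http://example.com/about",
    "https://shop.example.com/cart",  # other subdomains treated as separate sites
]
unique_domains = sorted(set(to_domain(u) for u in scraped))
print(unique_domains)  # ['example.com', 'shop.example.com']
```

As in the Excel version, subdomains other than www come through as their own entries, which matches the advice above to treat them as separate sites.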

Step 4 – Count # of Results and Average Rank For All Unique Domains

Once again, for this part I’m just going to give you the steps rather than screenshots since it might vary a little bit from spreadsheet program to spreadsheet program.

To set up your spreadsheet matrix, you should have all of your unique domains down the left, and then across the top you’ll have a column for # Results and Avg Rank for each of your keyword set shortnames. Put these at the top of the two columns and merge over them if you want to make it a little prettier, and it will give you something to reference in your formulas.

Getting Number of Results Per Unique Domain

  1. This formula will vary a little bit based on how big your data set is and where your list of original domains is.
  2. For example, if your first unique domain is in cell B3, your full data set of URLs is from cell D120 to cell D293, the column in the data set with the keyword set short names is in cells A120 to A293, and your first short name column name is in cell C1, your formula would look like this: =COUNTIFS($D$120:$D$293,"*"&B3&"*",$A$120:$A$293,$C$1)
  3. Notice the absolute references for the cell ranges. This is critical, otherwise you won’t get the correct count.
  4. The "*"&B3&"*" is a wildcard that basically says: match the contents of B3 with anything before or after it. So you'll catch www, non-www, the home page, and any other page on that domain name.
  5. If your formula looks good, copy it down to all of your unique domains.
  6. Repeat this process for all of your keyword sets.

What this number tells you is the number of times that unique domain shows up in your scrapes for that keyword set. If they show up a lot, then that's a set they are going after (the count can also be higher if they have multiple listings). If they don't show up very much, then it isn't an important set for them.
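For a quick sanity check of what the COUNTIFS formula is doing, here's a minimal Python equivalent. The rows below are hypothetical scrape data (set shortname, rank, URL), not real results:

```python
# Sketch of the COUNTIFS step: count how many scraped rows in a given
# keyword set contain each unique domain (substring match, like "*"&B3&"*").
rows = [
    # (set shortname, rank, url) -- one row per scraped result
    ("widgets", 1, "http://www.acme.com/widgets"),
    ("widgets", 2, "http://rival.com/widgets"),
    ("widgets", 3, "http://acme.com/blue-widgets"),
    ("gadgets", 1, "http://rival.com/gadgets"),
]

def count_results(domain, keyword_set):
    """Equivalent of =COUNTIFS(urls, "*"&domain&"*", sets, set_name)."""
    return sum(1 for s, _, url in rows if s == keyword_set and domain in url)

print(count_results("acme.com", "widgets"))   # 2 -- catches www and non-www pages
print(count_results("rival.com", "widgets"))  # 1
```

The substring match mirrors the Excel wildcard: www, non-www, and deep pages on the same domain all count toward that domain's total.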

Getting the Average Rank Per Unique Domain

  1. This formula will also vary a little bit based on how big your data set is, etc.
  2. Let's use the same example cells as listed above, but with your Rank data in cells B120 to B293. Your formula in Excel would look like this: =AVERAGEIFS($B$120:$B$293,$D$120:$D$293,"*"&B3&"*",$A$120:$A$293,$C$1)
  3. Again, notice the absolute references and make sure you have them in there.
  4. If your formula looks good, copy it down for all of your unique domains.
  5. Repeat this process for all of your keyword sets.

What this number tells you is the average rank for that domain name for that keyword set. Naturally, the lower the number the better they rank on average. The higher the number, the less of a threat they currently are, but it also shows that they are at least showing up for that set.
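The AVERAGEIFS logic can be sanity-checked the same way in Python. Again, the rows are hypothetical placeholder data:

```python
# Sketch of the AVERAGEIFS step: average rank of a domain within one
# keyword set, using the same wildcard-style substring match as Excel.
rows = [
    # (set shortname, rank, url)
    ("widgets", 1, "http://www.acme.com/widgets"),
    ("widgets", 3, "http://acme.com/blue-widgets"),
    ("widgets", 2, "http://rival.com/widgets"),
]

def average_rank(domain, keyword_set):
    """Equivalent of =AVERAGEIFS(ranks, urls, "*"&domain&"*", sets, set_name)."""
    ranks = [r for s, r, url in rows if s == keyword_set and domain in url]
    return sum(ranks) / len(ranks) if ranks else None

print(average_rank("acme.com", "widgets"))  # 2.0 -- averages ranks 1 and 3
```

Note the guard for a domain with no matches: Excel's AVERAGEIFS returns a #DIV/0! error in that case, so the sketch returns None instead.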

Step 5 – Organize and Analyze

Once you have all of your formulas down and you're happy with what you see, I recommend copying your matrix and pasting the values back in place. This way you can sort the data any which way you want without the sort breaking your formulas (I made this mistake once and it wasn't pretty).

Now you have a really cool matrix that will show you by keyword set which sites are going after different sets, and how important each set is to them based on how often they show up and what their average ranking is.

Have some fun sorting by different columns and even highlighting the numbers and sites that stand out to you. Here’s a screenshot sample of a matrix I did once to help.

Sample Data

What To Do With This Info

One of the problems with competitive analysis is that site owners and marketers only look at the companies they know, the major players in their space. Well, with this technique you will also see the affiliate sites that are competing that you may have overlooked, how big of a player sites like Wikipedia are in your space, etc.

If you run this every couple of months, you can also see the changes that are happening in the SERPs and better keep an eye on those sites that are becoming more of a threat.

As you identify new competitors, you also now have another site to analyze for marketing ideas, competitive links, etc.

Let Me Know What You Think

I really hope that this has been a helpful post for you to learn a technique to identify more of the true competitors in your space, and then what to do with that information. I’m sure that there are other ways to get this information, and if you have any additional tips please share them in the comments below.

Infographic – The Stars of Search Marketing via

This is a really cool graphic that we just published over at It’s got some interesting facts about the brightest stars of search marketing… Hope you enjoy!
History of Search Infographic

Content Is King, Autobloggers Fail

Well, in my whole life as a webmaster, content has always been king – traffic comes to people through your content. So what does that mean? It means that if you have great, quality content, people will recognize you for the things you write about and are passionate about.

With an autoblog, you're just copying other people's work, not bringing the true meaning of "content" to the webmaster community. When Google started cracking down on autoblogs, people started whining about it – but why? Because they were cheating the system and making money at the same time, which is very wrong.

So, we're back in the days of "content is king," where content drives your traffic. If you hear someone say that SEO is dead, tell them no – autobloggers are dead, and that's a fact. Google doesn't want spam sites in its search results, and we webmasters are sick and tired of battling autobloggers that steal our content. So we fight back by stepping up our link building campaigns and our content skills.

Let's say you ran an autoblog and then decided to change to a "real" blog: your chances of success are slim to none, because Google already knows where your content came from.

So, content is still king, and will always be king – and true webmasters define "SEO" by writing quality content for visitors, not for spiders.

So, do you agree that content is still king and that auto bloggers fail?

– Written by Cpvr, owner of Virtual pet list

SEO Your FAQ Section

A wealth of content opportunities

to soft sell your business

Your “Frequently Asked Questions” (FAQ) section on your website is undoubtedly one of the most visited pages on your site. It’s a straightforward way for curious readers to find the answers to their questions about your business and its offerings. Your FAQ page is an indispensable element of your overall customer care strategy.

In addition to providing a wealth of helpful information for your visitors, your FAQ section can pull double duty as a means of getting your website to rank high on search engine result pages (SERPs). Appearing high on the SERPs will drive new traffic to your website. You can get your FAQ pages to appear high in the SERPs by leveraging their content to gain greater exposure for your keywords.

How can you utilize your FAQ section to drive more traffic (which can translate to sales) to your website? Of course your FAQs need to be written in clear and uncomplicated terms, but applying some search engine optimization strategy to the page will help it come up contextually in an Internet search.

Anatomy of an FAQ Section

Let’s dissect a typical FAQ page into its individual components to discover how to optimize each so that it will help the page itself be ranked high in searches.

  • Heading – Title your page descriptively, being certain to use keywords in it. When the title is displayed as a heading in the page’s code, the search engine is even more likely to discover it.

Example: Instead of the heading on the page reading a generic FAQs, name it using a keyword or two about your services, such as, FAQs About Care for Curly Hair. By embellishing the heading title with the keywords Curly and Hair, the search engine can latch onto those words which describe your services and display them in its results.

  • URL – The URL of the page displayed in the web address box is an important way that search engines locate information. Rather than a default URL that may be automatically generated by your website’s content management system (CMS), customize yours to include the actual page title.

Example: Your content management system will create a “name” for each page of your website automatically but it may consist of numbers, letters and characters that have meaning only to your CMS itself. You can manually access this system and change the default URL to include whatever you have named your individual pages. A search engine friendly URL will look like this: By using actual words, the search engines will find your keywords.

  • Questions and Answers – When crafting your frequently asked questions and their answers, be sure to include your keywords, as appropriate.

Example: To a reasonable extent, make certain to utilize your keywords generously so they will be present on the page when the search engine spiders crawl your content. Instead of a question reading, “Do you do bridal styles?” try something like, “Does Hair-a-Plenty do bridal hair styles for wedding parties?” Those extra uses of your keywords will help your FAQs appear in SERPs for your keywords Hair-a-Plenty, bridal hair styles and wedding parties.

  • Anchor text – When linking to other pages on your website, be sure to create your links on your actual keywords.

Example: Rather than linking the words click here, link the keywords that actually describe the destination, such as hair conditioning treatments.

  • Embedded media – This includes any photos, diagrams, videos or audio files that are used in the presentation of your FAQs. Be certain to add your keywords to the titles and alternate text for this media on your website. Search bots crawl that text as well, so it presents yet another opportunity to have your content found in audio, image and video searches on your keywords.
  • Tags – Tag your page content with descriptive tags that are your keywords. Search engines look at tags, too, in order to discover the content of a webpage.
  • Sitemap – First of all, have one! While not a part of your FAQ page, it is important to make certain that your Sitemap does include your FAQ page.
  • Keywords – A strategic note on writing click-worthy FAQ pages: be conscious of keyword placement on your FAQ page. Because SERP readers are more likely to consider a result relevant when they see their search terms appear in the displayed snippet, it's especially important to use relevant keywords as close to the top of the page as is appropriate.

These strategies for search engine optimization present a wealth of opportunities for your FAQ page to be one of your most effective at driving traffic to your site. Qualified traffic that comes from strong keyword placement means more sales.

Debra Leitl is the Mentor in Residence at; you can find her on Twitter: @MentorMarketing. Her specialty is interactive marketing with a focus on ecommerce and online marketing strategy. She has over 15 years of professional experience in online retail, ecommerce consulting, and new business development.

Local Search Marketing with David Mihm

Guest: @davidmihm – a local search guru whose Local Search Ranking Factors project is among the most important studies of local SEO. Here's a link to the Local Search Ranking Factors project that @davidmihm puts together. @davidmihm is also the co-founder of – a tool used to assist many search marketers with their local optimizations. New releases coming on + Local U Portland, Local U Spokane, #searchfest @sempdx.

What are the three most important things you must do to rank well in the local results?

Flippant response: ‘location, location, location’. Google would say ‘distance, relevance, prominence’. On a practical level:

  1. Consistent NAP+W (Name Address Phone Web) across
  2. Strong location signal from your own website (title tags with city and state / HTML address, phone). By HTML address I mean your physical address in HTML (as opposed to Flash or an image), e.g. 123 Main St, Portland, OR 97209.
  3. Inbound links from geographically-relevant websites. Anchor text less important, IMHO.

Q: @lyena Is the proximity to city’s downtown still a factor or not that much anymore?

A: Proximity to centroid is less important now. Mobile phones = multiple centroids + organic plays larger role.

Q: @shuey03 Links to your website? or place page? or both?

A: Links to your website. Organic matters more than it used to with the new Place Search UI.

Q: @garyjmag Is hCard ideal?

A: Yes, the hCard microformat is ideal for putting your address on your own website

Q: @RobynStorms: Any tips when the client doesn’t want their address to show? How can I further optimize their local listing?

A: Google allows businesses to hide their address, but it is impossible to gain citations elsewhere without publishing it. Google really designed Places around in-person meetings between customer and business owner; it is baked into its DNA.

@zacpalmer: With the option to hide the address, the city will still show.

What are your favorite KPIs to analyze to track the success of your Google Places page?

The best KPI is OFFLINE tracking of the number of calls to the business. Have SMBs ask callers how they found the business. @mvanwagner cites a great example of a local bakery client that keeps track of phone call sources with different-colored Post-its. @seoverflow published a great tracking mechanism back in the days of the pure 7-pack; the O-pack has made that less relevant.

Q: @garyjmag: We use Voicestar/Marchex call tracking as the main KPI for local search. @igobydoc: Voicestar/Marchex works awesome! Love that service!

A: I advise against any call tracking system currently, and I’m not the only one (cc @si1very). It messes with your NAP. In my experience, business owners are savvy about call sources; tracking is more important for marketers to justify their services. What happens to the phone number when the business owner wants to cancel the contract with the provider?

@kmullett: We have used singular numbers with call tracking; it stays consistent and serves metrics.

@aknecht: I like using a call tracking solution if it’s a retail outlet that lends itself to calls. What Google offers to people claiming listings is also good. Call tracking is fine as long as it is one consistent number for Places and the default for your website.

Q: @lmgilson: Can you still show up in local search, if you don’t have a website?

A: @iNeils: Yes. Plenty of businesses in Places with no website.
@ashbuckles: I think ranking locally without a site will continue to become more difficult for many companies.

Q: @AnnieCushing: What recommendations do you have for mobile businesses that just list a service area?

A: They need a physical home base SOMEwhere. Even if certain engines don’t require a physical address, Google can’t spider an address that isn’t published, and that hurts prominence.
@Reesale: Yelp doesn’t require a physical location. I haven’t tried it but a rep told me that in person.

@matt_storms: Sometimes a client is so rural they can’t have a mailing address, so they have to use a physical one.

How can you integrate call tracking (different phone numbers) to track place page activity? Is there any danger in this?

Some background on why I feel call tracking is bad in Local: @mblumenthal speaks of ‘digital equity’ – your Facebook presence vs. your website presence. A phone number works the same way. It is your permanent thumbprint, and as an SMB you do not want to be beholden to any sole provider for that thumbprint. Call tracking is fine as long as the numbers are unindexable (rendered via JavaScript, or in an image with no alt text), but most placements are not that smart. Tracking numbers are OK on your own website, where you can implement them yourself, but not in directories.
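
A minimal sketch of the "tracking numbers only on your own site, unindexable" advice. Everything here is an assumption for illustration (the element id, phone numbers, and campaign keys are hypothetical): the canonical NAP number stays in the crawlable HTML, and a tracking number is only swapped in client-side when a campaign source matches.

```javascript
// Pick a display number per traffic source; fall back to the canonical
// NAP number so crawlers and untagged visitors always see the same one.
function trackingNumber(source, canonicalNumber, trackingMap) {
  return trackingMap[source] || canonicalNumber;
}

// Browser-only wiring (sketch): rewrite the rendered number after load,
// so the raw HTML that crawlers index keeps the canonical number.
// document.getElementById('phone').textContent = trackingNumber(
//   new URLSearchParams(location.search).get('src'),
//   '(503) 555-0100',              // canonical NAP number (hypothetical)
//   { adwords: '(503) 555-0142' }  // campaign tracking number (hypothetical)
// );
```

The design choice matches the warning in the answer above: because the substitution happens in JavaScript, directories and crawlers never pick up the tracking number as a conflicting citation.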

Q: @aknecht: Would you recommend that a service business register a physical address such as a UPS mailbox?

A: I would not recommend UPS boxes or PO Boxes. Google is adamant about physical locations where you can meet someone. I think Matt Cutts actually did a video on this a couple of weeks ago.

@aknecht: Google Places doesn’t like PO Boxes, so use a suite number instead – the host site knows what it means if mail were to arrive. Just be sure, if you used Mailbox Etc. or UPS etc., that it is as certain as possible that no one will use Places to come and visit you. The other alternative (just more $$) is shared office space: pay for a name on the door and a meeting room when needed.

@matt_storms: You cannot use PO boxes. It violates Google Places Terms and Conditions.

Are there any secrets or tools you use to find good citation opportunities?

If you haven’t read it, there is a VERY old post on mining your competitors’ Place pages for citations, though Google is obfuscating ‘more about this place’ now. Pure organic results for geo-relevant phrases now yield the best possibilities, IMHO. For citations in particular, that is the best I’ve seen. People might check out @seoverflow, but it is in beta.

@Reesale: Have any of you tried Yext Rep? It’s free and useful: it finds your listings and monitors reviews.

@kmullett: You can use Google operators to find competitors’ listings as well.

What significant changes to local search do you predict will surface in 2011?

I see vertical sites becoming more and more important. Carter Maslan at the @kelseygroup ILM conference in December: a major role for vetted niche directories. Google is clearly throwing a lot into Hotpot; @mblumenthal has compelling data on results in PDX. Twitter Places, Facebook Places, Hotpot. Check-ins from @4sq, @gowalla, etc. have had potential as ranking signals, but the user base is too niche; Hotpot, Facebook, and Twitter are much broader. A geo-social overlay is clearly where this space is headed. For the near term, it is critical for SMBs to have a comprehensive review acquisition program. It is important to ENCOURAGE conversations by customers, not fear them. (Got that line from @mattmcgee.)

@garyjmag: Local search impact in 2011: Check-ins on Latitude!

Q: @lyena: @davidmihm Do you see HotPot being a factor in local search soon or is it already?

A: Google is now showing Hotpot records to logged-in users and will surely be testing clickthroughs. Too early to tell right now.

Q: @garyjmag: Is the number of reviews a downgraded ranking factor in local search lately?

A: If anything, I would say the opposite.

Lyena Solomon is an internet consultant who helps businesses make money through effective websites and profitable online marketing and advertising. Lyena is a website analyst and strategist with extensive development and website usability experience. When helping companies, her main focus is on return on investment, profitability, goals, targets, and visitor engagement. She has been in the internet business for 15 years.

And We’re Off To SEO The Wizard

If you have ever tried to organize an SEO strategy, it is easy to feel like Dorothy, Toto, and her friends traveling through the Winkie Country. As they make their way through the country ruled by the Wicked Witch of the West, they are attacked by wolves, Winkie soldiers, birds, and bees, yet they still succeed until the Winged Monkeys capture them. A search engine optimization campaign can feel a lot like this experience if you are not aware of the hazards of the Winkie Country.

The SEO Cyclone

Although the journey through the Winkie Country describes what it is like to navigate SEO, the story really begins with the SEO cyclone: figuring out where to start while trying to survive the storm of SEO information. Search Google for “SEO” and you are bombarded with a dizzying swirl of advice, all of which claims to be the surefire way to succeed. Before you know it, you feel like the Wicked Witch of the East when the house finally lands on her.

The Road Through the SEO Forest

To make the road through the SEO forest a little easier, let’s boil the most important information down to the essentials.

  • Robots: Robots.txt is a file that you place in the root folder of your website.  It is an important part of your SEO strategy because it gives you control over which pages of your site search engine crawlers may access (and therefore which can be indexed), and if you reference your XML sitemap in it, search engines may discover new pages faster.
  • Ranking: Your website’s ranking with the search engines is another important element of your overall SEO strategy.  There are over one hundred different factors, both positive and negative, that affect your website’s position on Google’s search results pages.
  • Relevance: When you create your SEO strategy all of the SEO elements must contribute to your website’s relevance.  For example, the keywords you choose should relate to your content.  The sites that are pointing to yours should relate to the topic of your website.
  • Results: SEO involves a series of trial-and-error strategies, but the ultimate goal is to get your website onto the first page of the search results across a variety of keyword combinations.  Real SEO results happen when you succeed in driving highly targeted organic traffic to your website.
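
To make the Robots bullet above concrete, here is a minimal robots.txt sketch. The disallowed paths and the sitemap URL are placeholders for illustration, not recommendations for any particular site:

```
# Allow all crawlers, but keep them out of admin and cart pages
User-agent: *
Disallow: /admin/
Disallow: /cart/

# Pointing crawlers at your XML sitemap helps them discover new pages
Sitemap: https://www.example.com/sitemap.xml
```

The file lives at the root of your domain (e.g., example.com/robots.txt); crawlers fetch it before crawling anything else.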

Deadly SEO Poppy Field

There are a few things you should be aware of that can kill your SEO ranking chances before they are out the door:

  • Ignoring H1 and H2 Tags: H1 and H2 tags point out the most significant parts of your web page when the search engine spiders crawl it for indexing.  The tags tell the spiders what the page is about and why it is significant.
  • Failing to Do Keyword Research: If you do not know what words your target market uses to find your website, you may still attract traffic, but it will not be the kind of traffic that converts.
  • Content That Does Not Deliver on the Promise of the Link: If you create links solely to earn points with the search engine this will not help you in the long run.  If you provide links to additional content make sure the content delivers on the promise you made in the link.
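
As a quick illustration of the heading-tag advice above (the page topic and wording are invented for the example):

```html
<!-- One H1 stating the page topic, with H2s marking its major sections -->
<h1>Gluten-Free Baking Basics</h1>
<h2>Choosing a Flour Blend</h2>
<p>…</p>
<h2>Common Mistakes to Avoid</h2>
<p>…</p>
```

A single descriptive H1 with topical H2s gives the spiders the same outline of the page that a human skimmer would see.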

Winged SEO Monkeys

Quality content will help keep visitors on your site and protect you from the Winged SEO Monkeys.  If you provide inferior content, you will be unable to retain your visitors, not to mention risk being carried off by the search engine spider monkeys.  Quality content is information that is useful to your readers and relevant to the topic of your website.

Home Again

There’s no place like home, and your homepage design has an opportunity to help your SEO through Google Instant Previews.  This Google feature shows a preview of your page directly in the search results when a visitor clicks the magnifying-glass icon next to your listing.

If you imagine your SEO strategy in terms of the yellow brick road and focus on the elements outlined here, you will be well on your way to seeing the SEO wizard, and he will drive targeted traffic home to your website. This post was inspired by The Wonderful Wizard of Oz.

Debra Leitl is a Mentor in Residence, and you can find her on Twitter @MentorMarketing. Her specialty is interactive marketing with a focus on ecommerce and online marketing strategy. She has over 15 years of professional work experience in online retail business management, strategic management consulting, and new business development.