Friday, February 09, 2024

Keyword Metrics 3.0?

Here at SEO Research Labs, we've been doing keyword research for years. We've already changed the way we report data twice. You can see a description of the current reports here.

It's time to redesign the reports again, and develop a new set of metrics. I'm posting our list of factors to consider here on the blog, to invite your comments on what we should do next.

  1. Search volume forecasting: since the beginning, we have relied on search volume data from Wordtracker for our reports. Over the past two years, we've been monitoring the accuracy of Wordtracker vs. Keyword Discovery, based on actual search count data obtained from partners' pay-per-click ad campaigns.

    We still believe that Wordtracker is the most accurate source when it comes to forecasting the number of searches, but the gap is very narrow at this point, now that Keyword Discovery's premium database has come online.

    One decision we need to make is whether to offer a choice of data sources, or to include data from both Wordtracker and Keyword Discovery in our reports. My first impulse is to pay what it costs to include both, but some folks may find the "dueling data sources" confusing.

    The other big decision is whether to forecast search volumes on the major search engines, based on market share data. The public sources of market share data report a very wide range of figures - depending on who you believe, Google's market share may run anywhere from 28% to 70%. To me, that says "useless information." My first impulse is to eliminate the market share forecasting altogether, since we can't be certain of its accuracy.
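
    To see just how wide that spread is, here's a back-of-the-envelope sketch in Python (the daily search volume is made up for illustration; the 28% and 70% figures are the extremes of the published market share estimates):

        # Forecast the same total volume with the low and high published
        # market share figures for Google.
        daily_searches = 10_000          # made-up total searches per day
        low_share, high_share = 0.28, 0.70

        low_forecast = daily_searches * low_share    # 2,800 searches/day
        high_forecast = daily_searches * high_share  # 7,000 searches/day

        # The "forecast" spans a 2.5x range before we even account for
        # error in the underlying search volume estimate.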

    Your thoughts, please.

  2. Click-through traffic forecasting. We have included a traffic forecasting tool in our reports for some time, and I have always found it useful. However, this report is often a source of confusion, because its correct use is not well understood. We made some adjustments to the methodology early on, to make this a more "pessimistic" forecast, and we've revised the documentation to explain things better.

    The forecast is based on extensive data mining with real web sites. I've been running this analysis every 6 months (it's an expensive process). The last time we ran the analysis, the range of expected click-through rates for specific ranked positions was so broad that I am just about ready to drop this report.

    Since this tool is already based on market share estimates, my first impulse is to stop trying to forecast and instead offer a tool that lets users input their own estimates, perhaps pre-populated with "default values" from Netratings and our own data mining effort.
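
    As a rough sketch of what that tool might look like (written here in Python rather than a spreadsheet, with placeholder CTR values - these are not our actual data mining results):

        # User-adjustable traffic forecast. The default click-through
        # rates below are illustrative placeholders; the real tool would
        # pre-populate them from Netratings and our own data mining.
        DEFAULT_CTR = {1: 0.20, 2: 0.12, 3: 0.08, 4: 0.06, 5: 0.05}

        def forecast_clicks(monthly_searches, position, ctr=None):
            """Estimate monthly clicks for a given ranked position."""
            rates = ctr if ctr is not None else DEFAULT_CTR
            return monthly_searches * rates.get(position, 0.02)

        # A user who disagrees with the defaults supplies their own rates:
        my_rates = {1: 0.30, 2: 0.15}
        print(forecast_clicks(5_000, 1, my_rates))  # 1500.0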

    Your thoughts, please.

  3. Competitive landscape - how many competitors? Right now, we use three metrics to report on the number of competitors: the total # of matches for the search term, the total # of "in title" matches (search term in the title tag), and the # of matches with the search term in both the title and the anchor text of inbound links (title+anchor).

    I still like these metrics, but I am not completely happy about how we have to collect the data. Google's API allows us to collect this data fairly easily, although we are not able to collect data for more than 100 search terms in each report. We have frequent requests to collect more data, but it's just not possible.

    We've also included a "KEI" calculation for each search term, based on these metrics. My first, second, third, fourth, and fifth impulses are to stop reporting KEI in any form, since it is completely useless. We included it two years ago because a lot of SEO clients expected to see it, but if the market hasn't gotten any smarter about this since then, maybe we need to stop helping people stay ignorant.
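
    For anyone who hasn't seen it, KEI is commonly calculated as popularity squared divided by the number of competing pages, which is part of the problem: it rewards raw search volume quadratically. A quick sketch:

        # KEI as it is commonly defined: popularity squared over competition.
        def kei(searches, competing_pages):
            return (searches ** 2) / competing_pages if competing_pages else 0.0

        print(kei(100, 50_000))    # 0.2
        print(kei(1_000, 50_000))  # 20.0 - 10x the searches, 100x the score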

    Other than dropping KEI, my first impulse is to leave this report alone. My second impulse is to stop collecting the data ourselves, and build something into the spreadsheet that would allow users to input their own Google API key and collect their own data. Would the trade-off be worth it? Would users be willing to wait 2-3 hours to collect the data? Would we end up with a tech support nightmare? Would requiring Microsoft Excel be a problem?
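
    To make that second impulse concrete, here's a rough sketch of what user-side collection would involve, and why it would be slow. It's written in Python rather than Excel macros, count_results() is a hypothetical stand-in for the real Google API call, and the query operators are illustrative:

        import time

        def count_results(query, api_key):
            """Placeholder: wire this to the user's own Google API access.
            Should return the estimated total result count for the query."""
            raise NotImplementedError

        def collect_metrics(terms, api_key):
            rows = []
            for term in terms:
                counts = []
                for query in (term,
                              f'intitle:"{term}"',
                              f'intitle:"{term}" inanchor:"{term}"'):
                    counts.append(count_results(query, api_key))
                    time.sleep(30)  # throttle to respect API query limits
                rows.append((term, *counts))
            return rows

        # 100 terms x 3 queries at this pace is roughly 2.5 hours - which
        # is where the 2-3 hour wait comes from.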

    Your thoughts, please.

  4. Pay-per-click bids: we've included bids from Yahoo/Overture, collected by Wordtracker, for years. This hasn't always been perfect, because the search term Yahoo uses may not match the exact query found in Wordtracker, and sometimes we get no data for some terms. Now, it's academic: Yahoo no longer publishes this data, so we're dropping this report.

    One of the challenges with PPC reporting is that we can't (OK, won't) "steal" data. We won't screen scrape in violation of a search engine's TOS. So, unless there is a legal way to obtain the data, we aren't going to do it. With that said, it still might be possible to add another metric for PPC competition. Would it be useful if we could report on the number of advertisers? Any other metrics out there?

    Your thoughts, please.

  5. Link competition: this is a big report right now. We take the top 10 results from Google for the 100 most popular search terms, and we present a backlink count from Alexa, based on the number of web sites linking in to each ranked site. Sound complex? It's not, but it's a pain to explain.

    A recent hiccup with Alexa has forced us to reconsider this report, even though I like the number they give us. I am not sure we can rely on them, and I also think it's just too much information for the average user to digest.

    My first and last impulse is to turn this into a "top sites" report. In other words, we're going to do it... Since we're pulling the Google rankings for 100 search terms, we can "stack rank" the top-performing sites in the market fairly well. The current plan for this report is to show the top 100 sites, based on their overall presence in the search results for the 100 most popular search terms.

    Along with each site (listed by domain), we would show its total presence (# of times a page from the site appeared in the SERPs), breadth of presence (# of unique URLs that appeared in the SERPs), # of incoming links (from Yahoo), and # of sites linking in (from Alexa).

    We can also show the same data for the client's URL in this report, so that users can compare their own presence to the top sites.
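
    To make the "stack rank" idea concrete, here's a minimal sketch in Python of how the presence metrics would roll up by domain (the SERP data structure is made up for illustration; the Yahoo and Alexa link counts would be joined in afterward):

        from collections import defaultdict
        from urllib.parse import urlparse

        def stack_rank(serps):
            """serps maps each search term to its top-10 result URLs."""
            presence = defaultdict(int)     # total appearances per domain
            unique_urls = defaultdict(set)  # breadth: distinct ranked URLs

            for term, urls in serps.items():
                for url in urls:
                    domain = urlparse(url).netloc
                    presence[domain] += 1
                    unique_urls[domain].add(url)

            # Rank domains by total presence across all the SERPs.
            return sorted(
                ((dom, presence[dom], len(unique_urls[dom]))
                 for dom in presence),
                key=lambda row: row[1], reverse=True)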

    Your thoughts, please.
Thanks!
Dan

2 Comments:

At 7:53 AM, Blogger Mikel said...

Wow - what a lot of changes, so here goes.

Wordtracker vs. Keyword Discovery - could you offer the option of either? I agree that consumers may find the report confusing if they see two data sets, but comparing the two would be interesting. So long as we knew what we were comparing :)

Forecast on market share - ditch it. While you're at it, let's get rid of KEI too.

Google API & data collection - I think you're starting down the split-market route again. Asking consumers to get an API key and then use the spreadsheet ("you mean I have to enable macros?") could lead to a few tech support questions, but those of us who are more techie may well be happy with it, or enjoy the journey of discovery :)

Pay-per-click bids - no data from Yahoo? Wow, I missed that one. The manual collection still seems to work! Does this mean that Wordtracker will be stopping this service too?

Overall I have to say that your reports (and tool kit) have been a great resource. Whatever the changes, I'm sure they'll continue to help.

Mikel

 
At 3:49 PM, Anonymous Anonymous said...

Google offers a tool called the Traffic Estimator that gives an estimated bid range and ad costs for appearing in the top position 85% of the time.

It's easy to upload keywords in bulk and have it process even 1,000 keywords at a time.

 
