Frequently Asked Questions




General Questions

What do you mean when you say 'properly internationalized'?
There are many meanings and levels of internationalization. This Themezoom blog post does a very good job of explaining some of the various aspects of internationalization. TLKT™ does everything outlined in that post, including:
  • Multiple languages per country in countries with more than one national language
  • Proper currency display and retrieval for each supported country
  • Multi-lingual stemming and stopwords in our LSI and Bayesian analyses
  • Searches are localized in all listed search engines (e.g., searches in the France locale return only pages from France)
What if I don't want it to be internationalized?
     Part of the unique selling proposition of the Theme Zoom portfolio of products is that we strive to be as rigorous and accurate as possible. One of the challenges in creating a keyword tool of this caliber is that we are not in control of our data sources. Different search engines will give varying results based on personal preferences, the data center queried, the location of the computer performing the query, etc., and some search engines are better at internationalization than others.
     Due to these variances, there is no single 'right answer' to get from the search engines, but we've found that the most accurate and useful data comes out when searches are limited to specific geographical and linguistic markets. For example, if you are performing bankruptcy research in the US, you probably don't care about bankruptcy in the UK, Australia, or Canada in the slightest. We have found that performing research based on specific locales is vastly more useful than performing a general search, even if the results vary slightly.
     In short, we don't want to give you general results because they aren't good enough.
How often does your data update?
All of our data is obtained dynamically and then cached for 3 weeks. If a refresh drill is performed on a project, all of the data that is older than 3 weeks will be updated.
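For the technically curious, the refresh rule is simple enough to sketch in a few lines of Python. This is purely illustrative; the function name and timestamp handling are not a description of our actual code.

    from datetime import datetime, timedelta

    CACHE_TTL = timedelta(weeks=3)

    def needs_refresh(fetched_at, now=None):
        # Data older than three weeks is re-fetched during a refresh drill.
        now = now or datetime.utcnow()
        return now - fetched_at > CACHE_TTL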
I see 'Queue Size' in the various subscription plans. What is that, precisely?
     Since each project, refresh, or import requires a large number of queries to fetch data, it can sometimes take a few minutes to finish. In order to properly scale our resource usage and guarantee equal access to each of our subscribers, we developed a queue system. Any time you create a new project, refresh an existing project, or import a set of keywords into a project, an entry is created in a global queue. Entries are processed on a first-come, first-served basis, but the total number of entries in the queue for any single subscriber at any one time is limited to prevent abuse.
     There is a screen that tells you the status of all pending and recent requests and estimates how long it will take for them to be run. As we add subscribers to the system, we increase the capacity and speed of the queue accordingly to keep wait time as low as possible. Currently, the average wait time is less than a minute.
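For readers who want a more concrete picture, here is a simplified Python sketch of that kind of queue: one global first-come, first-served queue with a cap on how many pending entries any single subscriber may have. The cap value and class names are illustrative, not our production code.

    from collections import deque

    class RequestQueue:
        def __init__(self, per_subscriber_limit=3):
            self.pending = deque()
            self.per_subscriber_limit = per_subscriber_limit

        def enqueue(self, subscriber_id, request):
            # Reject the entry if this subscriber already has too many pending.
            count = sum(1 for sid, _ in self.pending if sid == subscriber_id)
            if count >= self.per_subscriber_limit:
                raise RuntimeError("subscriber queue limit reached")
            self.pending.append((subscriber_id, request))

        def process_next(self):
            # Oldest entry first, regardless of which subscriber submitted it.
            return self.pending.popleft() if self.pending else None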
What does a 'Deep Keyword Inspection' show me?
A Deep Keyword Inspection allows you to view the top results for a particular keyword along with a summary of the on-page and off-page ranking factors. This allows you to see, at a glance, exactly what you need to do to outrank the competition in any given market.

Technical Questions

Does TLKT™ use phrase search or broad search?
Every query we perform uses broad search.
What is a custom filter and why do I want one?
You are probably used to lots of filters in other keyword tools. Limiting by competing pages, price, clicks, that sort of thing. That's all well and good, but it's of limited utility.
With TLKT™, you can create extremely complex, interesting filters that are relative to the size of the project and save them to apply to other projects as well. Sure, you can just limit by price and the like, but you can also filter relative to the seed keyword and filter on arbitrarily complex mathematical expressions. Sound intimidating? Don't worry! We provide a complete set of default filters that may be all you ever need to analyze keywords in a whole new way. You can also use them as a starting point for your own research. Here are a few common examples that can replace other keyword tools entirely:

  • Niche Filter: keyword_lower_google_clicks_per_day > 0 and keyword_lower_google_cost_per_click > 0 and keyword_google_competing_pages < 300000
  • Relevance: keyword_lsi_score > 70
  • Market Segments: keyword_google_competing_pages < seed_google_competing_pages / 5 and keyword_google_competing_pages > seed_google_competing_pages / 100

For more details and some use cases for custom filters, view some of our screencasts.
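To make the expressions above concrete, here is a rough Python sketch of how a saved filter such as the Niche Filter or the Market Segments filter might be applied to a project's keyword rows. The field names mirror the filter variables; the sample rows and seed values are purely illustrative and are not TLKT™ data or internals.

    # Illustrative sample data only; field names mirror the filter variables above.
    seed = {"google_competing_pages": 2_000_000}

    rows = [
        {"keyword": "chapter 7 bankruptcy forms",
         "keyword_lower_google_clicks_per_day": 12,
         "keyword_lower_google_cost_per_click": 3.40,
         "keyword_google_competing_pages": 180_000},
        {"keyword": "bankruptcy",
         "keyword_lower_google_clicks_per_day": 250,
         "keyword_lower_google_cost_per_click": 6.10,
         "keyword_google_competing_pages": 2_000_000},
    ]

    def niche_filter(row):
        # clicks/day > 0 and cost-per-click > 0 and competing pages < 300,000
        return (row["keyword_lower_google_clicks_per_day"] > 0
                and row["keyword_lower_google_cost_per_click"] > 0
                and row["keyword_google_competing_pages"] < 300_000)

    def market_segment_filter(row):
        # Relative to the seed: between 1/100th and 1/5th of its competing pages.
        pages = row["keyword_google_competing_pages"]
        return seed["google_competing_pages"] / 100 < pages < seed["google_competing_pages"] / 5

    niche_keywords = [r["keyword"] for r in rows if niche_filter(r)]
    segment_keywords = [r["keyword"] for r in rows if market_segment_filter(r)]

Because the Market Segments filter compares each keyword to the seed rather than to a fixed number, the same saved filter keeps making sense whether the project is tiny or enormous.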
What is a custom column and why do I want one?
Custom columns are like opinions… everybody has one. One of the main selling points of ANY keyword tool is one or more columns of proprietary analyses that are supposed to help you in a way that only that keyword tool can. Far from knocking that concept, we embrace it completely. But rather than provide you with a set of columns that make sense to us, we provide the ability to create every possible column. Here are a few examples of typical custom columns:
  • KEI: keyword_upper_google_clicks_per_day ^ 2 / keyword_google_competing_pages
  • Pay-per-Click Market Value: keyword_upper_google_clicks_per_day * keyword_upper_google_cost_per_click * 365
  • Total Search Market Value: (keyword_upper_google_clicks_per_day + keyword_google_searches_per_day) * keyword_upper_google_cost_per_click * 365

For more details and some use cases for custom columns, view some of our screencasts.
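As a rough illustration of how these formulas translate into numbers, here is a short Python sketch that computes the three columns above for a single keyword row. The field names mirror the column variables, and the values are made up for the example; they are not TLKT™ data.

    row = {
        "keyword_upper_google_clicks_per_day": 40,
        "keyword_google_searches_per_day": 600,
        "keyword_upper_google_cost_per_click": 2.50,
        "keyword_google_competing_pages": 150_000,
    }

    # KEI: clicks/day squared, divided by competing pages
    kei = row["keyword_upper_google_clicks_per_day"] ** 2 / row["keyword_google_competing_pages"]

    # Pay-per-Click Market Value: clicks/day * cost-per-click * 365
    ppc_market_value = (row["keyword_upper_google_clicks_per_day"]
                        * row["keyword_upper_google_cost_per_click"] * 365)

    # Total Search Market Value: (clicks/day + searches/day) * cost-per-click * 365
    total_market_value = ((row["keyword_upper_google_clicks_per_day"]
                           + row["keyword_google_searches_per_day"])
                          * row["keyword_upper_google_cost_per_click"] * 365)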
I want to delete keywords from my list. Why can't I?
This comes up frequently. There is no technical reason that you can't delete keywords from a list, but as a matter of design, we try to encourage our users NOT to. As a research company, this is the single most difficult thing to teach our users (and they usually do thank us once they 'get it'). If you want a keyword not to show up, write a filter for it and save it so you can apply it to all of your projects. If you can't filter a keyword out without too much collateral damage, then it probably belongs there from an SEO perspective, even if you don't think it does. No doubt this is a point of contention, and we may eventually add the ability to delete, but we strongly believe it would be used as a crutch and would actually reduce the value of TLKT™ compared to properly creating and saving a filter.
When I perform a search for a keyword in a search engine, I get a different number of competing pages. Why is that?
As we mentioned before, we perform broad search only and limit our results to specific locales (a locale is a language and country combination). If you do the same in the search engines, you will get the same results.
What is LSI and why do I care?
LSI is an acronym for Latent Semantic Indexing and can be a bit of a hot button when SEOs get together and discuss strategy. Wikipedia has a great entry on what LSI is and how it works. We have also written an article on Latent Semantic Analysis because there is so much confusion and misinformation about it. As for how we use it: we fetch the top 100 pages in each search engine for the seed keyword, filter the pages for junk (tags, javascript, etc.), then build a Latent Semantic Index and use it to score the relevance of each returned keyword to the seed. Because each corpus (set of returned pages) is different, the LSI score is a relative value rather than an absolute one. A '10' in one project might be the same as a '16' in another, which is part of the reason we have custom tunable filters and columns. Using lsi_score_max and lsi_score_min in a filter allows you to write one relevance filter that works for all projects. We use this ability in our LARI sample column, which is provided with every subscription level.
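To illustrate the general technique (not our exact implementation), here is a rough Python sketch of LSI-style relevance scoring using scikit-learn: build a term-document matrix from the cleaned top pages, reduce it to a latent semantic space, then score each candidate keyword by its similarity to the seed in that space. The function and parameter names are illustrative, and the raw cosine score here is on a 0-1 scale rather than TLKT™'s scale.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD
    from sklearn.metrics.pairwise import cosine_similarity

    def lsi_relevance(pages, seed, candidates, dimensions=100):
        # Term-document matrix built from the cleaned top-ranking pages.
        vectorizer = TfidfVectorizer(stop_words="english")
        doc_term = vectorizer.fit_transform(pages)

        # Reduce to a latent semantic space (dimensions must stay below the vocabulary size).
        svd = TruncatedSVD(n_components=min(dimensions, doc_term.shape[1] - 1))
        svd.fit(doc_term)

        # Project the seed and each candidate keyword into that space and compare them.
        seed_vec = svd.transform(vectorizer.transform([seed]))
        scores = {}
        for keyword in candidates:
            kw_vec = svd.transform(vectorizer.transform([keyword]))
            scores[keyword] = float(cosine_similarity(seed_vec, kw_vec)[0, 0])
        return scores

Because the index is rebuilt from each project's own corpus, the same keyword can score differently in different projects, which is exactly why the scores are best treated as relative values.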
What is Bayes and how does it work?
The Bayes column is a form of naive Bayesian analysis used to determine the probability that a particular keyword belongs in the top 100 results of the various search engines. Bayesian analysis is most often associated with spam filtering these days, and Wikipedia has a great entry about Bayesian analysis and spam filtering. If you think about it, determining whether or not a piece of mail is spam is a lot like determining whether or not a keyword is relevant to the top results. As an implementation detail, the score is negative, and the closer it is to 0, the more relevant the keyword.
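Here is a very small Python sketch of the kind of naive Bayesian scoring described above, assuming the cleaned top-ranking pages are available as plain text. The function name and the add-one smoothing are illustrative choices, not a description of our exact model; the point is that the score is a sum of log probabilities, so it is always negative and values closer to 0 indicate a better fit.

    import math
    from collections import Counter

    def bayes_score(pages, keyword):
        # Word frequencies across the top-ranking pages for the seed keyword.
        counts = Counter(word for page in pages for word in page.lower().split())
        total = sum(counts.values())
        vocab = len(counts)

        # Log probability of the keyword's words under the corpus model,
        # with add-one (Laplace) smoothing for words the corpus never uses.
        score = 0.0
        for word in keyword.lower().split():
            score += math.log((counts[word] + 1) / (total + vocab))
        return score  # negative; closer to 0 means more relevant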
Why do I need an Adwords login?
There are no free tools from Google that give us this information, so we require a valid Adwords login in order to retrieve it.