A search engine optimization (SEO) suite is usually composed of several tools, each performing a very specific task: keyword research, rank checking, link popularity checking, crawlability tests, etc. In current suites there is limited or no collaboration between the different tools. Vendors expect users to be experts and to operate the tools as if the SEO suite were simply a toolbox rather than a fully integrated machine.
Therefore, what is needed is an SEO suite whose tools are ‘smart’ in the sense that each tool is fully aware of the other tools in the suite and of the information it can collect from them, so that it does not have to ask the user repeatedly for information it can obtain from the inputs or output of a previously run tool. The tools should also be able to ‘act dumb’ like traditional SEO suites and request all information directly from the user. Which behavior applies depends on the operating mode: smart mode or classic mode.
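The smart/classic distinction can be sketched as a tool that consults a shared suite data store before falling back to prompting the user. This is a minimal illustration only; the class and key names (`SuiteDataStore`, `RankCheckerTool`, `"keywords"`) are hypothetical and not taken from the source.

```python
from enum import Enum

class Mode(Enum):
    SMART = "smart"
    CLASSIC = "classic"

class SuiteDataStore:
    """Shared store where each tool publishes its inputs and outputs."""
    def __init__(self):
        self._data = {}

    def publish(self, key, value):
        self._data[key] = value

    def lookup(self, key):
        return self._data.get(key)

class RankCheckerTool:
    """In smart mode, reuses data already collected by another tool;
    in classic mode, always asks the user directly."""
    def __init__(self, store, mode, ask_user):
        self.store = store
        self.mode = mode
        self.ask_user = ask_user  # callable that prompts the user

    def get_keywords(self):
        if self.mode is Mode.SMART:
            cached = self.store.lookup("keywords")
            if cached is not None:
                return cached  # reuse a previous tool's output
        keywords = self.ask_user("Enter keywords: ")
        self.store.publish("keywords", keywords)  # share with later tools
        return keywords
```

A smart-mode tool running after a keyword-research tool would thus never re-prompt the user, while a classic-mode tool behaves exactly like a traditional standalone tool.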
Traditional SEO software performs a task known as rank checking. For that task the user is required to provide: a) a list of keywords to check, b) one or several URLs and, optionally, c) a list of search engines. The tool then goes to each search engine and systematically checks every search engine result page to find the provided URL(s). The tool's output is the list of positions the URLs hold for each keyword and search engine. This presumes that the user already knows which keywords people are typing to find the website.
What is needed is a tool that provides such functionality in an SEO suite without requiring any keywords or search engines to be provided beforehand; only the site URL and a web server log file for the site at that URL are needed. As a result, users can check their rankings effortlessly, and the software discovers the keywords and search engines visitors are actually using to find the user's site.
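One plausible way to discover keywords and search engines from a log file is to parse the referrer field of each request and extract the query term from known search-engine URLs. The sketch below assumes the common "combined" log format and a small, hypothetical map of engine hosts to their query parameter; real engines and parameters vary.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical mapping of search-engine hosts to their query-string
# parameter (illustrative; a real tool would need a maintained list).
ENGINE_QUERY_PARAMS = {
    "www.google.com": "q",
    "search.yahoo.com": "p",
    "www.bing.com": "q",
}

def discover_keywords(log_lines):
    """Return {(engine_host, keyword): hit_count} found in referrers.

    Assumes combined log format, where splitting on '"' puts the
    referrer URL in field 3.
    """
    hits = {}
    for line in log_lines:
        parts = line.split('"')
        if len(parts) < 4:
            continue  # malformed or truncated line
        url = urlparse(parts[3])
        param = ENGINE_QUERY_PARAMS.get(url.netloc)
        if not param:
            continue  # referrer is not a known search engine
        query = parse_qs(url.query).get(param)
        if query:
            key = (url.netloc, query[0].lower())
            hits[key] = hits.get(key, 0) + 1
    return hits
```

Running this over the full log yields both the keywords and the engines, along with a rough traffic count per pair, without the user supplying either up front.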
Traditional SEO software performs a task known as link popularity or backlink checking. Some more recent SEO tools go an extra step and perform what is known as link analysis—collecting the links and the anchor text (the text inside the links) of those links (for example, http://seoelite.com). In order to collect the backlinks, traditional SEO software queries the search engines with the “link:” or “linkdomain:” command. The problem with that approach is that search engines limit the results to only about a thousand links, and some search engines (e.g., Google) do not provide a reliable list of links or do not update the index with that information frequently enough. A large number of websites receive far more than 1,000 incoming links. Performing any type of analysis on such limited information means that the SEO tools on the market do not provide meaningful results and do not help in such cases.
Therefore, what is needed is a tool that provides such functionality in an SEO suite and extracts a larger number of incoming links for the user's website, because it extracts that information from the website's log file. The software should do this automatically for every single page on the website.
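Backlink extraction from a log file can be sketched the same way: every request whose referrer comes from a different host is evidence of an incoming link to the requested page. The function below is an assumption-laden illustration (combined log format, referrer in field 3 after splitting on `"`), not a complete implementation.

```python
from collections import defaultdict
from urllib.parse import urlparse

def extract_backlinks(log_lines, site_host):
    """Map each requested page to the set of external referring URLs.

    Only referrers from hosts other than `site_host` count as
    backlinks; internal navigation and empty ("-") referrers are skipped.
    """
    backlinks = defaultdict(set)
    for line in log_lines:
        parts = line.split('"')
        if len(parts) < 4:
            continue  # malformed line
        request, referrer = parts[1], parts[3]
        fields = request.split()  # e.g. "GET /page HTTP/1.1"
        if len(fields) != 3:
            continue
        page = fields[1]
        ref_host = urlparse(referrer).netloc
        if ref_host and ref_host != site_host:
            backlinks[page].add(referrer)
    return backlinks
```

Because the log records traffic to every page, this approach is not capped at a search engine's thousand-result limit; it is instead limited only by which links actually send visitors during the logged period.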
Traditional SEO tools analyze the top 10 results using statistical methods to compute relevancy metrics such as keyword density, prominence, and weight. The idea is to average the relevancy metrics of the top ten sites and use that average to guide optimization of the user's pages. There is a fundamental problem with that assumption: averaging the metrics of all sites in the top ten for any search in a popular search engine assumes all of them rank for the same reason. Unfortunately, search engines use far more factors than just on-page relevancy metrics. Some sites rank for on-page metrics, others for off-page metrics, and others for a mix of both. Extracting meaningful metrics from such disparate sources to guide optimization work is extremely unreliable. Furthermore, some sites rank in the top ten only short term (sites using black-hat techniques), while others rank medium term or long term. To achieve solid rankings, it is important to identify the sites that rank long term and to extract relevancy metrics only from those.
Therefore, what is needed is a competitive intelligence tool in an SEO suite whose sole purpose is to identify the sites in the top ten with solid rankings, and to automatically select one of them based on the probability of the user's site matching or beating its rankings.
Traditional SEO suites include on-page optimization tools that perform on-page analysis: keyword density analysis, keyword prominence, etc. Some also include an HTML editor to make it easier for the user to make the necessary changes during the optimization process. Unfortunately, in order to see how the relevancy metrics are affected by the changes, users need to upload the changes to their server and run separate tools every time they make a change. In a few cases the user makes changes locally on his computer, but still needs to run a separate tool to see the new relevancy metrics.
What is needed is an SEO suite that includes an intuitive on-page optimization editor that not only uses the relevancy metrics extracted from the competing web authority as a reference, but also shows the user's own relevancy metrics in a floating panel that updates automatically as the user edits the web page.
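The metric such a live panel would recompute on every keystroke is straightforward; keyword density, for example, is the share of the page's words taken up by the target phrase. The function below is one common way to compute it (counting multi-word phrases and crediting each matched word); exact formulas differ between tools, so treat this as an illustrative definition.

```python
import re

def keyword_density(text, keyword):
    """Percent of the page's words occupied by `keyword`.

    Multi-word keywords are matched as consecutive word sequences,
    and each matched word counts toward the density.
    """
    words = re.findall(r"[A-Za-z0-9']+", text.lower())
    if not words:
        return 0.0
    kw = keyword.lower().split()
    n = len(kw)
    count = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw)
    return 100.0 * count * n / len(words)
```

An editor's floating panel would call this (and similar prominence/weight functions) on the in-memory document after each edit, so the user never has to upload the page or launch a separate analysis tool to see updated metrics.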
Traditional licensing schemes generate a license key for a user, and the user activates or unlocks the software by entering that key. Some software validates the license key on a remote server to make sure the license is not already in use. The problem with such a scheme is that a clever user, such as a hacker, can activate the software on one computer, make an image (a duplicate of the licensed installation), and copy it to one or more computers; the software will keep running, bypassing the licensing protection.
What is needed to prevent this problem is that, when a user account is created, both a license key and an authorization code are created. The authorization code should have an expiration date, at which time it is used as the seed for a new authorization code.
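The rotating-code idea can be sketched as a server-side chain where each authorization code is derived from the previous one, so a cloned disk image holds a code that expires and cannot be renewed independently of the original install. This is a minimal sketch under stated assumptions: the secret key, the 30-day validity, and the use of HMAC-SHA256 are illustrative choices, not details from the source.

```python
import hashlib
import hmac
import time

SERVER_SECRET = b"example-server-secret"  # assumption: key held only server-side

def issue_authorization(license_key, previous_code=b""):
    """Derive an authorization code seeded by the previous code.

    Returns (code, expires_at). Because each code depends on its
    predecessor, two machines sharing one license diverge at the
    first renewal: only one holds the code the server expects.
    """
    code = hmac.new(SERVER_SECRET,
                    previous_code + license_key.encode(),
                    hashlib.sha256).hexdigest()
    expires_at = time.time() + 30 * 24 * 3600  # e.g. 30-day validity
    return code, expires_at

def renew(license_key, current_code):
    """At expiration, the current code seeds the next one."""
    return issue_authorization(license_key, current_code.encode())
```

On a cloned machine, renewal with the stale code would produce a code the server no longer recognizes (the legitimate install has already advanced the chain), so the duplicate stops working at the first expiration.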