In the field of search marketing, and specifically search engine optimization, watching and tracking your website rankings, performing detailed and ongoing keyword research, managing link programs, and running competitive analysis are all vital to your business and to building search engine rankings.
While much of this can be done manually, you simply don’t have the time.
Perhaps you can keep up for a while on one domain, but if you are smart, you have applied a multiple-domain strategy, supporting your niche marketplace with a domain ring and rich branding via URLs in PPC campaigns. Under this model you will be more strapped for time than ever – but you need to continue building.
Start with a basic but powerful SEO toolset that will do the job for you. Here are some tools I use and consider indispensable:
I have compiled a list of additional tools you may use – some free, some paid – on this comprehensive SEO tools list page.
New site owners are always eager to get listed in the search engines – and why not? It’s quite an ego trip to see your site in the search engines for the first time (hopefully for some relevant, searched keywords!), and to think about the money and information you can exchange online. Call your friends, partners and business networks and tell them you are “all over the Internet”.
However, many spend too much time doing it incorrectly, manually, or via software automation (spamming). They add hundreds of links and descriptions to as many search engines and related directories as quickly as possible. They feel this exercise will surely avert the dreaded (and mythical) Google Sandbox, where new sites appear to be “stuck” – unable to be searched or found for weeks, possibly months. And they believe PageRank and authority will quickly be built.
Not only are you possibly spamming the search engines and directories – you are also not going about it the right way.
First of all, the Google Sandbox does not exist. While it is true that newer sites receive a higher level of scrutiny from search engines like Google, there is no real truth to the rumour. Either way, the fix is not adding your site to hundreds of search engines (there are really only 4 that matter) – instead, set up and organize your site and links, and you will have no problem at all.
If you want to add to search engines and directories, use a more comprehensive strategy, and add to these sections with the top 4 engines – Google, Yahoo, MSN, Ask:
Ask.com has additional links you might want to check out as well:
The simple secret to effective web search index inclusion uses the strategies above, but it starts with links – get as many high-quality, relevant ones as you can – and you’ll be included faster than you can say “am I out of the sandbox yet?”. Do this over time, not in one day or one week, and you’ll see your traffic and business grow.
(To this day, I have never used the search engine submission links; I simply build links and the search engines come spidering. But for products, directories and shopping portals, you should use the above.)
The search landscape is changing. That’s nothing new. In fact, it changes all the time, for reasons you may not expect or even think about in your daily online work.
Your competitors are changing, and the search engines are updating their algorithms, finding new ways to provide the best opportunity for relevancy (and sales). And you (hopefully) are also adapting – adding content and tuning your sites for optimal organic results placement. Talk about a moving target!
On top of that, spammers and black hat SEO artists are constantly trying to beat the search engines – creating a volatile place for any sense of normalcy.
Example: I was doing some research using geo-targeted keywords. I noticed that a previously #1 page listing on Google was pushed to page #2, and it seemed so sudden.
What the heck happened?
Answer: Universal Search – a new search engine results format, a new way of presenting data – capturing video, maps and more from the emerging “social” aspects of the web.
How do you fix your ranking problem in this diversified way of looking at data?
(Disclaimer: just adding these tactics will move you in the right direction; however, there is ongoing work needed in this area, and it’s a long-term commitment.) Don’t exclude this emerging and important new wave – make sure to optimize for social media.
A good example of how to see multiple entry points in Google results is the famous Darth Vader example (scroll up/down the page to see entries from images, news, products, video and more).
If you are only applying the old SEO strategies – you will be left behind in 2008 and the competition will race past you.
In the field of SEO (search engine optimization), there are a myriad of things you can do to affect rankings in search engines.
While its basic premise is simple (content + links), you cannot pick up the strategy and tactics in a weekend or two. A long-term commitment is needed, along with a combination of technical, entrepreneurial, design and marketing skills – and certainly copywriting skills.
But, with those writing skills, you can get started today!
Last week, I was introduced to a company and their current website. Like most serious businesses, they wanted top rankings for some related keyterms (which is good). When I did more in-depth keyword research (this is where you must start), it revealed that none of their selected terms were actually being searched online (not good). You should still consider those related terms and test them via PPC & SEO. Monitor trends in your server and analytics logs for performing keywords, and continue keyword research.
Keywords with at least 100–200 searches per month (try the Overture Suggestions Tool) could form a good starter strategy.
Try first to lift your rankings in MSN & Yahoo. However, do the competitive research and get metrics out of those comparisons, since every marketplace is different. Yahoo & MSN favor content and care less about links – whereas Google cares about both.
This post is a reminder to look at quality content – always. That means your on-page factors need to be applied correctly.
Here is an onpage SEO list to (re)consider:
This is an example of an on-page optimized page, related to “corporate blogs”.
Content on your page with keywords and links can get you ranked higher in search engines.
The only way a search engine can know what your page is about is via keywords.
Make sure you apply these techniques, and continue to build external link profiles as well using the same link strategy.
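To make the on-page checklist concrete, here is a minimal sketch (standard-library Python; the sample page and keyword are placeholders, and `keyword_report` is a hypothetical helper, not a real SEO tool) of how you might verify that a page’s title, meta description and H1 all mention your target keyword:

```python
from html.parser import HTMLParser

class OnPageChecker(HTMLParser):
    """Collects the <title>, meta description, and <h1> text from a page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self.h1 = ""
        self._current = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("title", "h1"):
            self._current = tag
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current == "title":
            self.title += data
        elif self._current == "h1":
            self.h1 += data

def keyword_report(html, keyword):
    """Report which on-page elements contain the keyword (case-insensitive)."""
    p = OnPageChecker()
    p.feed(html)
    kw = keyword.lower()
    return {
        "title": kw in p.title.lower(),
        "description": kw in p.description.lower(),
        "h1": kw in p.h1.lower(),
    }

# Placeholder page, in the spirit of the "corporate blogs" example above.
page = """<html><head><title>Corporate Blogs Guide</title>
<meta name="description" content="How corporate blogs drive rankings"></head>
<body><h1>Why Corporate Blogs Matter</h1></body></html>"""
print(keyword_report(page, "corporate blogs"))
```

If any of the three comes back false, that element is a candidate for a rewrite before you worry about anything off-page.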
In my recent article, the 6 SEO blogging tips, I speak of the awesome opportunities that exist for you to combine a WordPress blog with some common sense business strategies and tools to get into a positive place for search engine inclusion and ranking.
This is only the beginning. There are more tips you can apply that will help you further. (A recent video from Matt Cutts, Google – is posted at the end).
Directories & filenames with underscores?
We know that directory and file names are important to search engines. The ongoing discussion around hyphens and underscores recently came up at a conference where the video below was taped. Google is looking into how to best deal with underscores, but an underscore in your URL like this: /sony_digital_camera is normally read by Google as ‘sonydigitalcamera’, whereas /sony-digital-camera (hyphens) is correctly interpreted as ‘sony digital camera’. The latter makes it easy for the search engines to read.
Worrying about this too much may cause headaches for an already established site (possibly losing established rankings) – but if you are building new pages, use the hyphen approach, with keywords in the URL and domain where you can. The plus sign (+) and dot (.) can also be used as separators, but stay away from underscores.
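One easy way to enforce this on new pages is to generate file names through a slug function that always joins words with hyphens. A minimal sketch (hypothetical `slugify` helper, standard-library Python):

```python
import re

def slugify(phrase):
    """Lowercase a phrase and join its words with hyphens, which search
    engines read as word separators (unlike underscores)."""
    words = re.findall(r"[a-z0-9]+", phrase.lower())
    return "-".join(words)

print(slugify("Sony Digital Camera"))  # sony-digital-camera
print(slugify("sony_digital_camera"))  # sony-digital-camera
```

Note that the regex treats underscores like any other separator, so even legacy underscore names come out hyphenated.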
Dynamic pages are good?
If you are dealing with dynamic pages (sometimes called spider traps) that contain long URLs separated by many ‘&’ characters, you will not be considered optimized for Google. In fact, if you have more than two parameters, you are losing indexing and ranking opportunities. Stay within 1–2 parameters max – something like this: http://www.yourdomain.com?z=parm1&y=parm2 – and you will be treated like a static URL. Beyond that, dynamic pages are not good for search engines.
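If you want to audit an existing site against this guideline, counting the query parameters per URL is straightforward. A small sketch (hypothetical helper names, standard-library Python) that flags URLs exceeding the 1–2 parameter rule of thumb:

```python
from urllib.parse import urlparse, parse_qs

def count_params(url):
    """Number of distinct query-string parameters in a URL."""
    return len(parse_qs(urlparse(url).query))

def crawl_friendly(url, max_params=2):
    """Flag URLs whose parameter count exceeds the suggested maximum."""
    return count_params(url) <= max_params

print(crawl_friendly("http://www.yourdomain.com/?z=parm1&y=parm2"))           # True
print(crawl_friendly("http://www.yourdomain.com/p?a=1&b=2&c=3&sessionid=x"))  # False
```

Run this over your server logs or sitemap and any `False` result is a URL worth rewriting.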
In that recent blog article, I speak of the directory levels that can be set within the WordPress system. I used the date-driven structure. That structure creates a deeper level tree, but you can also test using the category page only, applying a post slug (WordPress) to name your directory specifically. (The post slug option is on the lower right side of the page when you are in ‘Write’ mode in WordPress.) Either way, recent tests show that spiders will drill down and find your page.
Check out the (long) video and see how you can apply further techniques – as instructed from the top (Google)…
Last week, I spent time at the SES (Search Engine Strategies) show in San Jose, CA. Personally, I found the show even more energizing than in years before – not only because of my increased involvement and knowledge of the field of search, but also because of new friends made, like Bruce Clay, Michael McDonald (WebProNews) and Vanessa Fox (ex-Googler, now Zillow). Listening to some very engaged search marketing speakers, vendors and customers only added to this feeling, shared by many, I thought.
Outside of the developing areas of mobile search, local search, PPA (“Pay Per Action”, in beta at Google), video and social search, the paid links session became the biggest draw.
In the community of SEO there is always discussion around who is the best individual or team to craft content and apply website changes for optimum SEO compliancy.
Answers vary, but we rarely hear much success from Information Technology or developer groups handling SEO directly.
Reason: they are not good at it, and it’s not their job. Many SEO horror stories come from poor execution and judgement in applying on-page and off-page (linking) SEO strategies, and these groups do not understand the 360-degree view you must have into the SEO world and the search engines themselves.
Is it possible that a lone Microsoft .NET developer can do as good an SEO job as a specialized marketing person or outside SEM vendor?
Introducing the recently announced SEO Informer for .NET product.
It is a cool and very useful tool for Microsoft ASP.NET developers to apply high quality SEO tactics, right inside the .NET IDE (integrated development environment). Now developers can publish SEO compliant code and websites directly to marketing – SEO done right.
I recently sat down with the creators (Levi Page & Brian Mishler) of this new product. (No affiliation with Larry Page/Google). Here’s what they said:
Q: Who are you, tell me a little bit about yourself and the company?
A: Brian: “I have an aerospace engineering degree and I’m a self-taught software developer. I started with dBASE II and Clipper (Ed. note: I was #4 on the Clipper development team myself) and have a passion for writing commercial software.” Levi: “I have a degree in Computer Information Systems, and have been doing software for 15 years. I am really involved in SEO, software & graphics design. I started my software trek with a Tandy TRS-80 and QuickBasic. We have worked together for many years, and are partners in this new business. We both have strong roots in technology, and are very passionate about search engine optimization and bringing it out to people.”
Q: How did the idea for this product come about?
A: “We had been thinking about this for a while, and thought that Visual Studio add-ins might be really useful for developers, and that a tight SEO tool integration with their IDE could be important. You wouldn’t have to leave the environment you love, and could work within the ASPX pages directly. Also, we wanted something with instant feedback – no wait time. The controls would tell you what to do next, and you would not need to be an SEO expert. We also looked at other products, like IBP, TrendMX and SEO Elite, and saw no integration anywhere – only stand-alone products. Also, when Visual Studio launched with new concepts like ‘master pages’ and a whole new way of interacting with web pages and sites, we moved away from DreamWeaver (still a dominant web development product).”
Q: How does the product work?
A: “It uses metadata management, among other things. An XML file is updated at the core, and easily accessible panels allow you to edit directly, without worrying about the underlying data structures. Each node in the file represents a webpage, and at runtime an ASP.NET control or HTTP handler is used to merge data into the page header. Many developers are familiar and comfortable with XML sitemaps, but here you don’t have to worry about it. On the other hand, you could open the file and email the configuration and metadata to another party, and using Internet Explorer, for example, view all the nodes/properties of the pages in the site. Or just use it directly in the environment itself.”
Q: Who should use this product?
A: “The huge community of non-SEO-informed ASP.NET developers – they are finally able to use an integrated tool that helps them write pages that are optimized for the web from day one. It shows you what to type next, how to lay out the page and much more. Plus, it’s only $99.00, and there’s a free download.”
Q: What is the future of this product?
A: “We are always listening to our customers, and we internally brainstorm ideas every week and collect these to prioritize tasks/projects for development. The customers tell us what they want, and we typically act on it quickly. We are excited to be part of this industry, but are baffled by how much money is wasted, and how little people know about SEO and online marketing in general. It’s the cheapest form of advertising, and the old media (traditional magazine advertising, etc.) is not the way to go anymore.”
“You need search engine marketing, and organic listings are your best bet over time. Everybody online needs more traffic, and if done right, it will pay off – everybody needs quality traffic. We are working on a desktop version next, with more analysis tools, instant PageRank in pages, inbound link analysis, duplicate content checks – everything rapid-response via threading models.”
Thanks, guys. I personally installed this product, found it very intuitive, and recommend that my readers take a look – especially if you are a web developer using Microsoft Visual Studio & ASP.NET to build websites and want “instant SEO”…
Search engines love links. In 1998, when Larry and Sergey founded Google, they wanted to create not just a new search engine, but a highly relevant, reliable and trusted one. But how?
AltaVista Search was all the rage at the time, but the young founders decided that a key metric for their search engine would be based on something new: a core algorithm using links as a measurement of relevancy and ranking. The basis for their emergent PageRank system was that a link would carry a particular vote, or value, from a web page to the linked-to page.
There were, and still are, several additional factors that feed this value system, including anchor text, matching of theme/topic from the incoming link, semantic attributes of the on-page text surrounding the link, and checks for spammy-looking or paid links (a heated topic).
Having said that, you can easily rank for long tail keywords by making sure your TITLE, DESCRIPTION and body text match your keywords (1–2) for that page. I ranked my own site, SEO Videos Secrets, at the top of Google in 2 days using “seo video secrets” in a link from an external site – no search engine submission (a big no-no!). Long tail keywords are typically not highly searched terms (my example has none), but as an aggregate they can bring relevant (converting) traffic to your site. Try a free tool like http://tools.seobook.com/general/keyword/ to make sure they have search counts, and to find these obscure but strategic search terms across multiple engines. Then watch your analytics and make sure they convert.
If you are trying to elevate your listings, build traffic and create a trust network for yourself (you should be!), you can do a lot with your SEO by using linking strategies in combination with the above, creating a better competitive edge for yourself, and a sustaining search engine presence.
Some past examples of highly ranked first-page results *without* content matching the link text are “click here” (Adobe Acrobat: to download, ‘click here’), “miserable failure” (the White House Google bomb) and “robot” (the I, Robot movie with Will Smith, which only had Flash as content).
Your link strategy for relevancy, ranking and traffic should include adding links from the top directories, not just partner sites or social networks. You must apply the link from within related categories, and ensure a “clean link” – something like this: '<a href="http://www.your-website.com/">Your Preferred Keyword Here</a>'.
You should also verify that the directory is being indexed by Google. You can do this with the site: command – for example: site:www.botw.org. (Check the cached data for the latest update – it should be very current!)
Additional tips to begin or enhance your link building can be found at the Top 101 link building tips website.
Top 10 Directories For Link Love:
Dmoz.org – free – can take weeks, months, even years
Dir.Yahoo.com – $299/year – listed quickly, internationally as well
GoGuides.org – $39.99 per URL – included right away
JoeAnt.com – $39.00 one-time – about a week to be listed
Gimpsy.com – $40.00 reviewed – if turned down, a $20.00 fee applies; commercial
Botw.org – $69.95 per year or $199.95 one-time fee – should include
Uncoverthenet.com – $49 or $199 – depending on feature
Site-sift.com – $29.95 annual recurring, other programs – high PageRank inner values
Lii.org – free – highly valued, but hard to get; you should have lots of high-quality content
Wowdirectory.com – typically free, $59 top placement – large directory
There are several others, including article directories (ezinearticles.com, isnare.com) and specific B2B directories like Business.com you may want to check out as well.
You heard it here: “Linking can be fun”. Happy Friday!
Traditional search uses algorithms and objective properties – via links from other pages – to provide results. The next-generation search experience – sometimes dubbed the Google killer – is social search: human-assisted search results.
There is little doubt that human-powered search engines have their place, and that user-content-driven interaction with websites, and specifically with search, is important and expanding in popularity.
Nothing new, though. In fact, Ask.com started out this way, but it had problems scaling up. Yahoo started in 1994 with their now famous directory, and it was all about human interaction & validation. Microsoft also had human editing of results, but once Google took over the world with their link-based system – and it worked so well – the other engines were left in the dust.
Today Google, considered by many to be a purely algorithmic search engine, has been and still is using humans to stay close to search results, and is looking to add more. They know the combination of both is key to the success of the “Web 2.0″ phenomenon.
Whether the short list below will provide a v1.0 testbed for the “next big search engine model” – beating Google at its own game – remains to be seen. We all know that Google didn’t get to #1 by slacking off.
Brin & Page, the founders of Google, are fiercely competitive, and their top priority has always been to provide the ultimate in search relevancy and user experience; I don’t see them being overtaken any time soon.
A brand new human-powered search engine from the founder of Wikipedia was just launched (http://search.wikia.com/wiki/Search_Wikia). It’s getting lots of press, and since it’s based on open-source crawling and has a large existing and growing user base, it will be an interesting development to watch.
It’s all about the user community, and unlike the ridiculous editing experience of directories like DMOZ, it’s refreshing to see this open dialogue and debate. The Wikia crawler can be found at Grub.org, where you can download it for your own use.
Other social search tools that you should check out are:
There is an area of the Google back-office support system that is unknown to many people learning about SEO. And as recently as last week, Google launched an update to this system – and they want to talk to you.
It’s also the easiest and fastest way to get your site synced with Google, and where you can get statistics and feedback directly from the Google search engine – crawling and indexing updates, error and verification issues, and sitemap submissions.
Furthermore, it provides a detailed reporting system and expanded lists of your backlinks (link popularity), which normally are not shown via the Google search box. (That is the 'link:http://www.yourdomain.com' command.)
Read more at http://www.google.com/support/webmasters/ and https://www.google.com/webmasters/tools/docs/en/about.html, and click the “Go to webmaster tools” link on the right of that page to sign up.
Understand that you must have access (FTP or otherwise) to your site to “verify” it (a process that lets Google know it’s your site).
Sitemaps, both on your site (HTML) and via the Google Webmaster Console (XML), are an easy and important way to manage your site with Google.
This is true for existing sites and new sites, and it is an awesome tool for dealing with moving domains and updating Google correctly in the process.
You need to build your XML sitemap to submit.
A great tool is http://www.xml-sitemaps.com/ for starters, and you should read more about the initiative from the giants at http://www.sitemaps.org/ (Google, Microsoft, Yahoo).
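If you would rather script it yourself, the sitemap format is simple enough to emit directly. A minimal sketch (standard-library Python; the URLs and dates are placeholders) that builds a sitemap following the sitemaps.org 0.9 protocol:

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Build a minimal XML sitemap per the sitemaps.org 0.9 protocol.
    `urls` is a list of (location, lastmod-date) pairs."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for loc, lastmod in urls:
        lines.append("  <url>")
        lines.append(f"    <loc>{escape(loc)}</loc>")      # escape &, <, > in URLs
        lines.append(f"    <lastmod>{lastmod}</lastmod>")  # W3C date format
        lines.append("  </url>")
    lines.append("</urlset>")
    return "\n".join(lines)

# Placeholder pages - substitute your own site's URLs.
sitemap = build_sitemap([
    ("http://www.yourdomain.com/", "2008-01-15"),
    ("http://www.yourdomain.com/corporate-blogs/", "2008-01-10"),
])
print(sitemap)
```

Save the output as sitemap.xml at your site root and submit it through the Webmaster Console.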
Reading the webmaster guidelines on all the search engines is important if not vital to ethical SEO success, and they keep updating the information.
How do you keep track of webmaster changes?
A cool tool to keep yourself updated when things change is www.google.com/alerts – just input the keywords/phrases you are watching for, and you’ll receive an email message weekly, daily or as-it-happens. (Try adding your own name too!)