Internet Without Google - Story by Rob Ousbey

Written By M. Anandhan Mudhaliyar on Saturday, December 10, 2011 | 2:42 PM


I’d like to take a couple of minutes to take a look at the past and the present. This post won’t teach much about SEO; it’s just a hypothetical look at an alternative universe, where the early days of search went a bit differently. (Is ‘search engine fanfiction’ a genre?) Here goes….

A Story - The Internet Without Google

In the mid-1990s, two Stanford grads – funded by various federal councils – were working on a project to apply their understanding of network topology and the principles of academic citation to Internet. As Internet had become more accessible to the public, the number of people browsing it – as well as the number of people generating content – had increased with ‘hockey stick’-like growth. Finding content was a matter of either knowing a particular domain that should be visited, visiting a trusted directory where links were categorized and curated, or using one of the new ‘Internet search portals’.

The search portals (sometimes referred to as ‘search engines’) were technologically impressive, with millions of pages available to search across, but they were unreliable: the results were usually based on little more than whether a user’s query term appeared prominently on the page. The Stanford project – Backrub – had taken a different approach, using the principle that pages to which someone had created a hyperlink – whether from a bulletin board, a news group or another site on Internet – were likely to be more trusted, and so should appear more highly in search portal results. Other signals, such as the topic of the linking page, were used to calculate the target page’s relevancy to a particular keyword.

Backrub was first tested with content from almost five thousand separate sites – and it worked well. The results were consistently relevant and were still calculated in a fraction of a second.

With this successful proof of concept complete, the two engineers wrote up their analyses and results, and moved on from the project. They were awarded their Ph.D.s later – but their creativity had attracted attention within tech circles: one of the students accepted a job with a successful software company from the Pacific North West; the other left the US to join one of the largest industrial technology companies in his home country.

The Backrub technology was recognized as useful, but Stanford had no resources to scale it further or Ph.Ds interested in taking it on. After being shelved for almost six months, the technology was passed on to the Department of Information Technology and Telecommunications; many of the underlying processes were successfully patented soon after.

Next Steps

The DoITT was a new agency, and had been assigned various (poorly specified) responsibilities pertinent to Internet; the mandate to ‘improve the user experience of Internet for all citizens’ combined with a substantial budget allowed them to scale up the technology. In the first large crawl in 1996 they retrieved over a hundred million pages from almost half a million sites. These were taken from a seed list of sites listed in two of the largest editorial directories, after around eight percent had been removed (DoITT’s guidelines prevented them from linking to adult material or ‘illegal content’.)

After this first large crawl, the frontend was released at
Although it wasn’t heavily promoted, the service was popular and mainly recommended by word-of-mouth. In March 1997, less than six months after the unofficial launch, a New York Times editorial said that “the DoITT had restored faith that the Government is capable of innovative technology, after the fiascos of various bloated IT contracts for the Department of Defense. This portal was a triumph.” The Washington Post called it a “huge success.”

That year, three positive factors all combined: positive feedback from users (who mostly described it as ‘the first portal that actually finds what I was looking for’), further reviews in the press (who were happy to have a story about Internet that could be understood by the general public) and increased promotion by the government (who were proud to promote a service that demonstrated America’s cutting edge presence on Internet.)

Hugely accelerated growth followed: later that year, one ISP estimated that 19% of homes used the service more than once a week.

By early 1998, users had already seen a couple of significant changes to the service. The service had been spun off from the DoITT, and found a new home at http://Search.Gov
Due to the increasing popularity, the service now also required free registration to access search results: this only needed a verified email address, and the registered account could be used on any computer.

Various issues took place behind the scenes as well: a young internet company tried to diversify from being a pure Internet directory into offering search portal facilities as well: they used a similar algorithm to the original Backrub service, whilst other companies also tried to tackle the ‘Internet search’ issue with different approaches. The threat of prosecution for infringing upon the Government’s various patents proved too much for these startups, and they generally didn’t pursue this any further.


A team inside Search.Gov were responsible for managing ‘user satisfaction’ – this included analysis of various user behavior patterns (such as the number of searches to find a particular result, which they correlated to a worse user experience) and the manual review of ‘search quality reports’ which users could complete via a form on the site.

An overwhelming problem for this team – and for Search.Gov as a whole – was the increasing rate of growth of content published on Internet. Companies were increasingly putting their whole catalog online, news organizations would publish their best articles on a website every day, and new tools made it increasingly easy for the public to publish small sites of their own, usually on webspace provided by their ISP. Many of the quality issues perceived by Search.Gov were due to new sites not appearing in the results – the Search Quality team had to manually review sites that had been discovered or submitted, and was sitting on a backlog of around three months of submissions to process.

(The site tried to solve this by removing the manual review stage, and automatically adding every newly discovered site to the index. The idea was that sites which contained adult or illegal material could always be removed again later. This worked well for around six weeks, until users began to complain of increasingly irrelevant results, poor quality sites full of gibberish, and the increasing likelihood of pornography appearing in search results. Search.Gov quietly removed the changes and returned to manually reviewing sites, but with additional resources and a better process. Sites that were permitted generally arrived in the Search Index within 3-4 weeks of having been submitted or recommended by an internal algorithm.)

Creative Funding

The increasing cost of running the service (both from technology and human resource) was of concern to various parties within the Government, as well as Conservatives who believed that this shouldn’t be a federally funded operation. Some web directories and similar services had seen success in selling targeted advertising on their listings pages: this was much more effective than traditional display advertising, as a ‘premium listing’ or ‘sponsored listing’ could place a website’s link prominently in the appropriate category and attract people who were looking for precisely that type of site.

While Search.Gov had looked into including adverts and premium listings on results pages, it was deemed inappropriate for a Government service to be supported by advertising.
“H.R. 1337 – Search Portal Funding Bill” was designed to tax companies listed in Search.Gov based on different criteria. It was argued that the tax would stymie initiative from small companies and would completely remove non-commercial sites from the results – the bill was never passed in the House.

However, a funding opportunity came from a much larger source – overseas. Search.Gov had recognized that the number of registrations from international users was increasing. A suggestion to raise revenue by formally licensing the technology to other countries saw very little objection from within the Government or in the press. The United Kingdom and Australia were the first countries to launch bespoke sites (at Search.Gov.UK and Search.Gov.AU) for a fee (technically a technology export tax) that equated to roughly $13 per citizen per year. Germany and Japan followed around a year later, after the system was extended to cope with non-English sites.

The governments of these countries could adjust certain parameters of the algorithms to better suit their audience: the most significant change was usually to show a preference for websites from that country, and to block particular sites that carried content which was illegal in their country.

The steady stream of revenue from these licensing arrangements ensured that Search.Gov was self-sufficient; it was allowed to use a majority of the income to improve and extend the service, whilst a portion was passed up the chain into the Federal budget.

After the technology was passed overseas, it became necessary to restrict the use of Search.Gov to residents of the USA. This was implemented by having users associate their Social Security Number with their Search.Gov account. Since this allowed for the association of an individual’s real identity with their Search.Gov account, it naturally caused some controversy, particularly amongst those concerned about Search.Gov’s increasingly close cooperation with the FBI & National Security Agency.

However, the criticism was mainly silenced in early-2000. A school teacher was arrested for sexual abuse of his students; later analysis of his search history showed that he had regularly looked for lewd and inappropriate content. A bill was quickly passed that year, allowing a review of an individual’s search history when they apply for any government-based job. (The individual’s permission was required, but this was mandatory for most positions.)

In response to the 2001 terrorist attacks, the President created special new powers for the security agencies to increase the amount of search data collected and deep forensic analysis undertaken. Full details were never released, but it’s understood that as well as looking for general search patterns, it was now permitted to review the search history of any individual suspected of a serious crime.

Search Portal Optimization

During these years of growth for Search.Gov, a whole new industry was spawned: Search Portal Optimization, or SPO. SPO was the practice of optimizing sites to rank as highly as possible in Search.Gov for the appropriate phrases. While many sites paid little attention to SPO, others worked hard to push their pages as high as possible in the rankings.
SPO Consultants typically referred to four broadly separate stages of the SPO process:
  • Ensuring a site was easily browsable by search portal robots (by using HTML standards, rather than Flash, etc)
  • Applying for the site to be included in Search.Gov and appealing any rejections
  • Identifying the search terms the site should rank for, and creating relevant content
  • Building hyperlinks to the site from external sites.
Search.Gov did not release any data about the most searched for terms, but SPOs could get access to some data from ISPs who were able to estimate the Search Volume for some larger terms.

In addition, the Search Quality team rarely gave much feedback to sites that were not accepted into the portal’s index. In the early days, discussion centered around sites that were blocked for carrying information about homemade explosives or promoting drug use. Some sites were removed from the index for carrying illegal content: a retailer that sold radar jammers for automobiles was removed after three months.

An Internet Log (‘netlog’) that was critical of the Government’s approach to farming subsidies was not accepted into the index; the New York Times covered this controversy, and was also removed from the index. Two days later, both sites were available in the index: a Search.Gov spokesperson said “this had been unintentional, and was not an attempt to remove content from Internet. We are committed to fostering free speech online, and to a free and open Internet.”

(A Freedom of Information request showed that the number of sites not accepted into the Search.Gov index increased from around 10,000 in 2000 to 198,000 in 2008.)

Link Building

Once in the index, sites would work hard to improve their SPO, to try and rank above their competitors. The main focus of many SPO experts was link building: getting more links from more sites would generally result in a better ranking. Search.Gov did publish a set of Webmaster Guidelines that explicitly mentioned that links should not be created purely in order to manipulate the Search Algorithm – but this did not put off the so-called ‘blackhat SPOs’, who aimed to get high rankings for their site, at any cost.

This led to something of an arms race between Search.Gov and blackhat SPOs: each time the blackhats found creative new tactics to create sneaky links online, the Search.Gov quality team would devise algorithmic methods to remove the impact of these links from their calculations.

However, the secrecy around Search.Gov’s algorithm, and the amount of authority passed through each link, meant that webmasters/SPOs/link builders were never sure which links were passing value and which were not. Comment spam, low quality directories and other questionable tactics continued unabated. A particular issue for the search quality team was paid links. While some paid links were easy to detect, others were only obvious when reviewed by a human. Rather than discounting links, Search.Gov used ranking penalties to dissuade sites from purchasing links.

A typical penalty would be when a page with many inbound paid links would have its ranking position dropped by 20 or 50 places – webmasters would have to remove the offending links to see that page start ranking well again.  For more serious infractions of the guidelines, a whole site could be removed from the index for a month or two. (Although users could still navigate directly to a site, the lack of non-branded traffic would decimate a commercial site.)

Despite the risk of page and site penalties, many blackhat SPOs were still not put off from creating low quality content, and building links that violated the Webmaster Guidelines, since they would usually dump a penalized site and work on another one instead. In 2007, after increasing public complaints about the declining quality of Search.Gov’s results, the service was able to have parts of the Webmaster Guidelines adopted as law. The first criminal trials against SPOs were held in February 2008: the two owners of a credit card comparison website were charged with the Class C Misdemeanor of ‘Intent to Mislead a Search Portal Algorithm’ for their paid links and each served 30 days in jail.

Algorithm Updates

The changes to Search.Gov’s ranking algorithm were not always subtle. Various substantial changes were implemented, typically to try and tackle a specific type of search quality problem, such as duplicate content, content farms, etc.

One large update in March 2009 was seen to promote the sites of big brands higher up the rankings; it was likely that this change came in response to a request by the newly elected President, who had received huge campaign contributions from large corporations in the election. Many small businesses said this pointed to the ‘increasing politicization’ of Search.Gov’s algorithms. The impact of lobbyists on these decisions was seen even more clearly in 2010, when tobacco industry lobbyists were successful in getting cigarette brand websites included in the index. (Until that point, they had been excluded on the basis of the broader laws about advertising tobacco products.)

An algorithm update in 2011 appeared to be designed to reduce the impact of ‘content farms’. One notable side-effect of the change was that ‘scraper sites’ increasingly appeared to outrank the original versions of content they had republished. Shortly after this, Search.Gov assisted in the creation of laws that brought unauthorized content scraping within the formal definitions of grand larceny. (Until then, it had been treated as simple copyright infringement.) Republishing another site’s content without its consent was now punishable by up to a year’s jail time. Although there were some prosecutions, this wasn’t particularly effective at reducing the number of scraper sites hosted and owned overseas.

International Trouble

Search.Gov was facing some bigger issues, as far as its international relationships went.
By 2011, Search.Gov had licensed the technology to 44 countries. In May that year, two Turkish citizens arrived in the US, and were immediately arrested on charges of terrorism. Although the US had no access to information about the websites they’d visited directly, the search history of one of them had raised flags at the NSA, which led to further investigation and ultimately their arrest.

Foreign governments and citizens were not enthused by the idea that Search.Gov technology was being used to track their Internet searches. The patents that prevented independent Search Portals launching in the USA were not recognized all over the world, and a number of commercial companies ran similar services – particularly in Europe and Asia. In the weeks after the Turkish controversy, the market share (outside the US) of services built upon Search.Gov dropped from around 95% to 90%. The number declined further over the next twelve months as the largest commercial Search Portals reinvested their profits in improved indexing and ranking technology. By August 2011, three countries (Turkey, Germany and Brazil) announced that they had ceased using Search.Gov’s technology. All three countries allowed competition in the Search Portal market, though the German Government maintained www.Suchen.DE, which was now powered by technology from a German company.

In late 2011, Search.Gov suffered a series of ‘cyber attacks’, which seemed to originate from Russia and China. (Neither country used Search.Gov technology for their own services.) Although the attacks may have been an attempt to hack the site and steal personal data, Search.Gov announced that no data had been accessed. However, the site suffered significant periods of downtime, initially for three hours, then a week later for six hours. Various rumors suggested that the US believed China’s government was behind the attacks – this was never confirmed by the Department of State.

The Future

Despite some controversies, Search.Gov has been a solid technology for users in the USA and abroad for more than a decade and a half. It’s now used by over 90% of US households on a weekly basis, though most people use it much more than that: the average user makes 9.5 searches per day on Search.Gov.

Search Portal Optimization continues to be serious business: various research suggests there could be up to 200,000 people employed in the US to perform SPO – either directly for a company or through an agency.

The Search.Gov technology continues to evolve: new content is now discovered more quickly, and the results continue to improve. The organization currently employs around 10,000 engineers and operations staff within the US, and another ~18,000 full and part time employees responsible for assessing new sites for submission to the index. There are around 600 employees of Search.Gov stationed around the world to work with the teams in countries that license the technology.

Search.Gov is the ‘first page of the internet’ for most people. I don’t know what we’d do without it….


OK, let’s step back to the real world. This thought-experiment focused very much on what could have happened in my alternate universe.
What’s also interesting are the things that might not have happened:
  • without the necessity to create PPC, internet advertising may never have really evolved beyond traditional banner ads
  • no AdSense means that people may never have bothered creating/maintaining many smaller content sites
  • innovations such as Gmail and Google Analytics wouldn’t have appeared
  • with no-one to acquire them, YouTube wouldn’t have been able to grow so fast. (In fact, with their large amounts of copyright material in the early days, would they have even been allowed in the index?)
  • newly wealthy employees wouldn’t have left to create sites such as Twitter, Foursquare and FriendFeed.
If you got this far, thanks for reading. I enjoyed the opportunity to try writing some long-form content.
If you’d like to share your thoughts on this potentially very different online landscape: you can leave a comment below.


SEO Tips: Seven things to do at weekend

Readers, the following is a guest post from Rob Ousbey of Distilled

For people who do SEO in-house at a company, or for people who run their own websites/businesses, this can actually be a valuable time to do some work out of the office. You can focus on tasks (big and small) that you just don’t tend to find the time to do when you’re in the office, and you might even find that your creative juices are flowing particularly well after stepping away from the ‘snow-blindness’ that a 9-5 routine can deliver.
With that in mind, I’ve suggested a few different tasks below. If you have a few minutes or a few hours over a weekend or the holidays to do something work-related, and would like to achieve something more productive than just checking your email, then one or two of the following might provide a useful diversion.

1: Double Check Your Sitemaps

Task: Parse your XML sitemaps to make sure you’re not pointing to any URLs that 404 or 500.

Why: Bing has talked publicly about how having more than 1% ‘dirt’ makes them lose trust in your sitemap. Google is likely to take a similar approach.

How to: You could co-opt your whole family to spend an afternoon checking pages from your sitemap. As an alternative, Mike King (@IPullRank) has created a nifty tool, the Map Broker XML Sitemap Validator, to do all the work for you. Submit the address of your sitemap (or upload the file) and check up on the errors. Save the output for when you’re back in the office and have time to start fixing the URLs in there or removing the errors.

Time Required: From what I’ve seen, I’d allow about 5 minutes per 100 URLs. The length of time that it’ll take you to fix any problems depends on how good your data quality was to begin with!

Map Broker by iPullRank
404 and 500 errors are particularly bad, but do look out for 302 or 301 redirects as well.
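If you’d rather script the check yourself, it only takes a few lines. The sketch below uses just the Python standard library; the XML namespace is the standard one from sitemaps.org, but the details (a 10-second timeout, treating anything other than a 200 as ‘dirt’) are my own assumptions to adapt:

```python
import xml.etree.ElementTree as ET
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

# Standard sitemap namespace, per the sitemaps.org protocol.
LOC = "{http://www.sitemaps.org/schemas/sitemap/0.9}loc"

def extract_urls(sitemap_xml):
    """Pull every <loc> URL out of a sitemap document (a string)."""
    root = ET.fromstring(sitemap_xml)
    return [loc.text.strip() for loc in root.iter(LOC)]

def check_status(url, timeout=10.0):
    """Return the HTTP status code for a URL via a HEAD request."""
    try:
        with urlopen(Request(url, method="HEAD"), timeout=timeout) as resp:
            return resp.status
    except HTTPError as e:
        return e.code          # 404, 500, etc.
    except URLError:
        return 0               # DNS failure / unreachable

def dirty_urls(sitemap_xml):
    """Return (url, status) pairs for anything that isn't a clean 200."""
    results = []
    for url in extract_urls(sitemap_xml):
        status = check_status(url)
        if status != 200:
            results.append((url, status))
    return results
```

Feed `dirty_urls` the contents of your sitemap file and you get back exactly the list to triage on Monday morning. Note that a HEAD request won’t reveal 301/302 redirects by default (urllib follows them), so extend it if redirect-spotting matters to you.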

2: Reclaim Some Link Strength

Task: Find any pages where people are linking to URLs on your site that 404.

Why: Not only will Google be wasting crawl allowance looking at useless pages, but you may be missing out on link juice from people who have mis-linked to you.

How to: In Google Webmaster Tools, click on ‘Diagnostics’ -> ‘Crawl Errors’. Clicking the link on the far right of each line will show you which pages are linking incorrectly. If there are any internal links listed here, keep a note to work through the list later to fix each one. For external links you can decide whether you’d prefer to contact the site to ask them to correct the link, or to 301 the target URL to a more appropriate page. It’s worth mentioning that in the process of reaching out to people, you may even be able to use the opportunity to strengthen your relationship with them – which could be really valuable for the future.

Time Required: 5 minutes to log into WMT and to start parsing the data. 5 minutes for each site you need to contact, to find a contact email address and send them the email.
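If the crawl-errors report is big, a few lines of scripting will split the internal fixes from the outreach targets for you. This sketch assumes you’ve exported the report as a CSV with ‘URL’ and ‘Linked from’ columns; those column names are an assumption, so adjust them to match whatever your actual export looks like:

```python
import csv

def broken_links_by_source(csv_path, own_domain):
    """Split broken-link reports into internal fixes vs outreach targets.

    Returns two lists of (linking_page, broken_url) pairs: links from
    your own domain (fix these yourself) and links from external sites
    (contact the webmaster or 301 the target).
    """
    internal, external = [], []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            pair = (row["Linked from"], row["URL"])
            if own_domain in row["Linked from"]:
                internal.append(pair)
            else:
                external.append(pair)
    return internal, external
```

The `external` list then doubles as your outreach to-do list for those 5-minutes-per-site emails.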

3: Add a New String to Your Bow

Task: Find one new head term that you want to rank for

Why: There’s very little risk to targeting a new term with your site, and you’ve got an opportunity now to try something that you may not otherwise have dedicated any time to.

How to: You may well have a variety of terms that you had in the back of your mind that you might want to target. For this exercise, you should look for something that’s going to be reasonably hard to rank for, but could bring a worthwhile amount of traffic to your site. Use your favorite keyword tool (I’d just go straight to Google’s Keyword Tool) to find a term that could be among your 10 biggest non-branded keywords if you ranked on the first page for it. If you’re an SEOmoz user, I’d use the Keyword Difficulty Tool to make sure that the term you’re going after is at least a 40 or 50% difficulty term.

Write content that targets the term, grab an appropriate image, and get a page up on your site with a targeted URL and title. Bear in mind the question “does this page deserve to rank in the top-10 for my keyword?” If not, think about what additions to the page would make you answer ‘yes’.

Link to the page with exact anchor text from the home page of your site, and encourage a few of your friends or fans to tweet about it. The page isn’t going to rank #1 straight away, but leave it to get indexed by search engines and see if it shows up anywhere. When you’re back in the office, think more critically about your off-site strategy for this page.

Time Required: 30 minutes research; 1 – 2 hours to build the page; 30 minutes to promote it to a few relevant people.

I'm all a quiver. Photo CC licensed by Seriousbri

4: Give Your Videos a Fighting Chance

Task: Create a video sitemap for your site

Why: Video sitemaps are still incredibly effective at getting your videos indexed by Google, and giving you the opportunity to have blended video results in very competitive SERPs.

How to: Depending on time allowed, I’d suggest we get this up and running as a proof-of-concept, rather than a full blown video sitemap system right now. As such, don’t worry about an automatically updating system with all the bells and whistles. Start by collecting a list of the videos on your site (including: their titles, the page they’re embedded on, the actual location of the video file.) Look at the video sitemap format specifications on and figure out how you’re going to produce a file in that format with your data. (You could do some ‘copy-and-paste’ work if there’s only a modest number of videos. Another basic solution might be to create a sheet in MS Excel to do this for you. I’ve not been impressed by any of the various ‘video sitemap generators’ that I’ve seen online; I’m very open to recommendations.)

Once you’ve created the sitemap: upload it to your server, add a reference to it in your robots.txt and submit it to Google via Webmaster Central.

Time Required: Depending on the number of videos you have, it could take anywhere from a few minutes to a few hours to collect the video data, and a similar amount of time to create the sitemap. If you have huge numbers of videos or don’t have easy access to video data, consider just spending 30 minutes creating a video sitemap for a dozen videos and uploading it, to see if you can start getting blended search results for those videos.
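As an alternative to the Excel route, here’s a minimal proof-of-concept generator in Python. The namespaces are the published sitemap/video-sitemap ones, but this sketch emits only a bare-minimum set of elements; check the current spec for required fields (duration, publication date, etc.) before submitting anything for real:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
VIDEO_NS = "http://www.google.com/schemas/sitemap-video/1.1"

def build_video_sitemap(videos):
    """Build a minimal video sitemap from a list of dicts, each with
    'page' (the embedding page), 'title', 'description', 'thumbnail'
    and 'content' (the video file URL)."""
    ET.register_namespace("", SITEMAP_NS)
    ET.register_namespace("video", VIDEO_NS)
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    for v in videos:
        url = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        ET.SubElement(url, f"{{{SITEMAP_NS}}}loc").text = v["page"]
        vid = ET.SubElement(url, f"{{{VIDEO_NS}}}video")
        ET.SubElement(vid, f"{{{VIDEO_NS}}}thumbnail_loc").text = v["thumbnail"]
        ET.SubElement(vid, f"{{{VIDEO_NS}}}title").text = v["title"]
        ET.SubElement(vid, f"{{{VIDEO_NS}}}description").text = v["description"]
        ET.SubElement(vid, f"{{{VIDEO_NS}}}content_loc").text = v["content"]
    return ET.tostring(urlset, encoding="unicode")
```

Collect your dozen videos into that list-of-dicts shape (even by hand), run the function, prepend an XML declaration, and you have a file ready to upload and submit.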

5: Appeal to a New Audience

Task: Write a blog post about something that will appeal to a very different crowd to your normal audience.

Why: The best reason to try this is to widen the potential group of people who might link to your site or promote the content on niche sites.

How to: It can be hard to pick one niche from the huge variety of people that fall outside of your typical audience, so focus on something ‘non-core’ that you’ve done recently that someone might find interesting. By way of examples: could you write about some aspect of your current website that would be interesting to Hacker News readers? (One non-technical client wrote about a custom Drupal module they’d created; the post did well on Hacker News, the Drupal forums and on other developer sites.) Could you write a post about how you designed your logo, and get it featured on Design Float? Could you write about tips & tricks you use to be more productive, with a view to coverage from sites like Lifehacker?
Do be aware of how your regular blog readers may respond to an off-topic post. If they’re interested in seeing another side of you, then that’s great. (We get good responses to the occasional non-SEO post on Distilled; SEOmoz are known to go off-topic from time to time.) If regular readers are likely to be unhappy about you going off-topic, consider publishing this type of content on a static URL or in a separate section of your site.

Time Required: How long does it take to write a blog post? You tell me. Give yourself a deadline (eg: 30 minutes for idea generation, 90 minutes to draft the post, 30 minutes to find a handful of prospects to promote it to) and see how you get on.

6: Mine your Mailing List

Task: Use a service such as Qwerly or Fliptop to gather intelligence on your contacts

Why: You can look at the email addresses in your customer database or on your email subscription list and identify the people that could be most useful to create closer relationships with.

How to: I used to recommend Qwerly; the company has recently been purchased by Fliptop, so I don’t know what the long term plans for the Qwerly API are. I’m investigating Fliptop at the moment, and will write about that in more detail when I have something to report! With both services, you’re able to parse a list of email addresses and get back data including the Twitter and other social media accounts associated with each address; there’s a small fee per contact, which can add up across thousands of contacts.

With this information, you could find the customers/subscribers with the highest follower count or Klout Score. These people would make great targets for outreach to promote content or new products. If you have limited time over the weekend, you could at least send a couple of personalized emails to influential contacts from your lists; thank them for being a reader/user/subscriber/customer and perhaps offer them something special.

Time Required: Getting up and running with a new API can sometimes be a struggle, and will depend on your tech ability. However, Fliptop does appear to be working on ways to make it easy to import your contact lists directly (from tools such as MailChimp, ExactTarget, ConstantContact and Salesforce) which may make the process much faster.
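Whichever enrichment service you end up with, the ranking step itself is simple. The sketch below is deliberately service-agnostic: `lookup` stands in for whatever API wrapper you write, and the response fields (‘followers’, ‘twitter’) are assumptions about its shape, not any real Qwerly or Fliptop schema:

```python
def top_contacts(emails, lookup, n=10):
    """Rank contacts by follower count using an enrichment lookup.

    `lookup` is any callable mapping an email address to a dict like
    {"twitter": "@handle", "followers": 1234}, or None if the service
    has no data for that address.  Returns the top `n` contacts as
    (email, followers, twitter_handle) tuples, biggest audience first.
    """
    enriched = []
    for email in emails:
        data = lookup(email)
        if data:  # skip addresses the service couldn't enrich
            enriched.append(
                (email, data.get("followers", 0), data.get("twitter"))
            )
    return sorted(enriched, key=lambda t: t[1], reverse=True)[:n]
```

The top of that list is your weekend outreach shortlist: the handful of subscribers worth a personalized thank-you email.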

It takes an API to make me happy.

7: Impress the Linkerati

Task: Identify three influential bloggers or writers that you’d like a link from and do something to impress them.

Why: This is the kind of task that’s easy to overlook while you’re in the office; let your creative juices flow and do something out-of-the-ordinary.

How to: Though there’s plenty to say about identifying influential people in a particular space, I’d hope that you already have a good handle on who the influential writers in your niche are. If you need some inspiration, think about the blogs that you read (or the ones that you read when you were just getting into your industry.) Alternatively, check whether there’s already a curated list of bloggers on Alltop and take a look at who’s listed there. Use their search feature, or I’m sure you can figure out the URL format there ;-)

What inspires your three favorite bloggers to write a post? What impresses them enough to share something? You don’t need to execute on this straight away, but try to come up with a suitably long list of ideas and start whittling it down. Do you want to follow WePay and drop a 600lb diss outside your competitor’s conference? Or, like Nordstrom, insist upon ‘one holiday at a time’? Or, like Game of Thrones, release a 2 minute ‘production trailer’ on TV and online to start building buzz?

Time Required: Give as much time as you want to thinking about this. Think about it in the shower. Think about it when you’re falling asleep. Think about it when you’re reading the blogs you want to get covered on. You might suddenly have a startlingly great idea, or you might create enough average ideas that you can discuss them with the rest of your team to iterate and create something really effective.

Enjoy your holiday season and all the long weekends. Have fun with your friends and family, and don’t work too hard.

Google’s Two Social Networking Sites: Orkut and Google+

Written By M. Anandhan Mudhaliyar on Saturday, July 9, 2011 | 3:23 AM

Saturday, July 9, 2011

Google’s other social networking site, Orkut, which has been around for about seven years and has tens of millions of users worldwide, will continue to operate alongside the new Google+ for now.

However, Google is leaving its options open regarding the possibility of fusing the two through some sort of integration further down the road.

“Orkut and Google+ are different products, and will both exist. Over time, we’ll determine what makes the most sense in terms of integrating these products,” a Google spokeswoman said via e-mail, when asked what the future holds for Orkut.

At this point, it’s not clear whether Google will provide a mechanism for Orkut users to export their friends list and profile data to Google+, or add Google+ privacy features to Orkut.

With Google+, the company is hoping to make a run at Facebook with a social networking site whose privacy features are tighter and easier to configure.

Google is betting that while Facebook is the dominant player in social networking, enough of its users feel uneasy about its privacy controls that they would be willing to switch to a rival that offered them an improvement in that area.

In its design, Orkut, launched in 2004, is generally similar to social networking sites of that generation, including Facebook, MySpace and Friendster.

Orkut ranks 102nd on the list of the most popular websites in the world, but most of its users are concentrated in two countries: India and Brazil.

As the Google+ project shows, Google clearly felt that it needed to create a new social networking site from scratch in order to give Facebook a run for its money in the U.S. and around the world.

It will be interesting to see if the company opts to continue operating the two sites in parallel in the long term, or whether having Orkut becomes strategically contradictory as Google preaches its new privacy approach with Google+.

The good news for users of Orkut is that there seem to be no imminent plans to shut it down. “We will continue to invest in the product,” the spokeswoman wrote.

Follow me @ Twitter | Facebook | Linkedin | Digg

Quality Directory Submission Process

Written By M. Anandhan Mudhaliyar on Thursday, July 7, 2011 | 4:20 AM

Thursday, July 7, 2011

There are a lot of advantages to directory submission for search engine optimization. Primarily, directory submissions result in one-way links to your site. In addition, if it is a high traffic directory, you can also get visitors to your site through the directory, though I would not count on this kind of traffic.

Of particular value are specialized directories that are related to your site because these provide relevant one way links back to your site. The other big advantage is that as you become listed on high profile directories such as DMOZ, it improves your site’s credibility and facilitates other link exchanges because people are more willing to work with webmasters and sites that are well connected and have a good reputation.

Below are 6 tips to help you get the most out of your directory submission process.

1. Keep a Log of All the Directories You Have Submitted To

Create an Excel sheet or get a lined notebook to write down the directories that you have submitted your site to so you don’t accidentally waste time doing it twice. Plus, this gives you a long list that you can check off in the future when you have other sites to submit.

Also, while I generally don’t recommend paying for directory inclusion, keeping records will help you to know how much you are spending and better decide if it is worth the cost.
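Tip 1 is easy to script as well as to do in Excel; here's a minimal sketch (the file name and column layout are my own choices) that keeps a CSV log and checks it before you submit anywhere twice:

```python
import csv
import os

def already_submitted(log_path, directory_url):
    """Check the log so we don't waste time submitting to the same directory twice."""
    if not os.path.exists(log_path):
        return False
    with open(log_path, newline="") as f:
        return any(row and row[0] == directory_url for row in csv.reader(f))

def log_submission(log_path, directory_url, date_submitted, cost=0.0):
    """Append one record: the directory, the date, and any fee paid."""
    with open(log_path, "a", newline="") as f:
        csv.writer(f).writerow([directory_url, date_submitted, cost])
```

Summing the cost column later tells you exactly what paid inclusions are costing you.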

2. Compile a List of Directories Before Starting

There are thousands of directories out there. Fortunately, others have already compiled lists of directories for you so that all you need to do is search out these sites and sort their lists by high page rank and the niche directories related to your website. Start with these and, if you still have time afterwards, then expand your list. With literally thousands of directories (most offering little value), however, you want to start with the ones that are likely to give you the most success.

3. Choose the Right Category

Selecting the most relevant category is of key importance in directory submission. With every directory, there is a webmaster deciding whether your site is going to contribute to their business. If your site is not going to add value to their directory, then your submission is on a fast track to the garbage bin, and the fastest way to end up on this track is to submit to the wrong category. In addition, while not all directories allow you to submit to high-level categories, if it is possible, you want to be on pages with the fewest clicks from the home page. The closer you are to the home page, the greater the value of that link back to your site.

4. Build Your Title Wisely

Most directories allow you to build a title for your site. This title becomes the hyperlink back to your site, and so the keywords used in the title will be the keywords that Google deems important on your site. If you want your blog to rank well for ‘Latest SEO Tricks’, then it makes sense to title your submission something like “Latest SEO Tricks Blog” rather than just your brand name. Once you have built a keyword-relevant title, you will then want to vary it between submissions so that the link building appears more natural to the search engines.

5. Build a Keyword Relevant Description

While the description is less important than the title, its keywords help to determine the relevance of the link, and it also acts as your pitch to potential customers browsing the directory. Write three short descriptions (15-25 words) and three longer descriptions (30-50 words) and vary them with each submission, depending on the guidelines of the directory. Also avoid excessive capital letters and punctuation, as overuse of both can get your submission rejected.
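If you're varying titles and descriptions across many submissions, a small script helps; here's a sketch (all the variant strings are hypothetical examples, and the word-count check mirrors the guideline above):

```python
import random

# Hypothetical title variants -- rotate them so the link building looks natural.
TITLES = [
    "Latest SEO Tricks",
    "Latest SEO Tricks Blog",
    "SEO Tricks and Techniques",
]

def desc_length_ok(description, min_words, max_words):
    """Check a description against a directory's word-count guideline."""
    n = len(description.split())
    return min_words <= n <= max_words

def pick_listing(descriptions, rng=random):
    """Pick a random title/description pair so repeated submissions vary."""
    return {"title": rng.choice(TITLES), "description": rng.choice(descriptions)}
```

Run each of your six descriptions through `desc_length_ok` once before you start submitting.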

6. Stretch Out Your Submissions Over Time

The search engines do not like to see abnormally large bursts in the number of links to a site, as it generally suggests that artificial link building is being done. The search engines want natural link building, and the more natural you make it look, the more value you will get out of your links.
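For illustration, pacing submissions can be as simple as capping how many you do per day; a hypothetical sketch (the per-day cap is arbitrary):

```python
from datetime import date, timedelta

def schedule_submissions(directories, start, per_day=3):
    """Spread directory submissions over successive days so links accrue
    gradually rather than in one suspicious burst."""
    plan = {}
    for i, directory in enumerate(directories):
        day = start + timedelta(days=i // per_day)
        plan.setdefault(day, []).append(directory)
    return plan
```

Seven directories at three per day, for example, become a tidy three-day plan.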

By following these tips, you should be able to enjoy a good return on your time investment into directory submissions.

Effective Backlinks Approach Tips

Hello friends. As we all know, getting and sustaining a worthwhile amount of traffic in your internet marketing business is never easy, and it can certainly take up a considerable amount of your marketing time; for my friends and me, it’s a big chunk of the workload.

One way to improve search engine ranking is through link building. Link building is important, and so it should be done on a regular basis to facilitate your rise in the search engine rankings. You can bring a reader directly to your website by using appropriate keywords in the anchor text of your article; just always make sure your keywords incorporate what your research shows readers in your niche are searching for.

Remember Targeted Traffic is Vital

Some website owners have devised methods to generate their own backlinks from subsidiary websites they have created. They build other websites and backlink them to the primary website, which in turn also links to those websites.

This is known as a link wheel. The link wheel can be as large as you can make it, but do bear in mind it takes time, and there needs to be synergy between the various sites – you want and need targeted traffic, not, for example, weight-loss visitors arriving at a gourmet recipe site brimming with high-calorie recipes.
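If it helps to picture the structure, here's a small sketch of the links a basic link wheel produces (site names are placeholders):

```python
def link_wheel(satellites, hub):
    """Build the link pairs for a simple link wheel: every satellite site
    links to the hub (your primary site) and to the next satellite in the ring."""
    links = []
    for i, site in enumerate(satellites):
        links.append((site, hub))                                # satellite -> primary
        links.append((site, satellites[(i + 1) % len(satellites)]))  # ring link
    return links
```

Three satellites give six links in total: three pointing at the hub and three around the ring.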

It’s All a Learning Curve – Don’t Sweat It

When I first started my Internet marketing business, I didn’t fully understand the concept of backlinking. I knew about submitting articles to directories, which I did, but my primary objective was to get known in my niche and to get traffic back to my website. I then had a light-bulb moment and realized that by submitting articles I was actually achieving both and improving my search ranking as well – it’s amazing how dumb you can be sometimes!
It certainly made me feel a bit stupid, but hey, we learn by our mistakes, don’t we?

The Old Favorite!

Apart from article marketing, there are other ways to get backlinks to your website. One of them is forum posting. Forum posting earns you backlinks because, by commenting and offering helpful advice, many of the visitors who read and devour those posts will over time come and visit your site, since you have offered useful and worthwhile help with your responses to them or to others on the forum. In other words, you become a bit of a trusted source – you’ve helped others; get what I mean?

Get on The Forums in Your Niche Now

Forum commenting and posting takes time but is easy, and although you may consider yourself a bit of a newbie, you’ll soon find there are others out there who are far newer to internet marketing than you.

To post on a forum, you will be required to register as a member – it’s easy. There are two ways to benefit from forum posting. The first is using the signature file.

The signature file gives you a place to put your link, and it appears beneath each post you make. The second way is to put a link in the post itself. A word of warning here: many forums don’t allow that and will ban you for it – so the golden rule is to read the regulations for each site you sign up to, adhere to their rules and stay within the regulations.

A key to successful back linking through forum posting is to contribute helpful information while making a post. That way you are not seen as an opportunist and you become a trusted helpful source.

Become a Regular Blog Commenter

Blog commenting is also a way to gain backlinks from other websites. When you post a comment on a blog that is related to your niche, you can get a backlink from there. Comments on blogs are often moderated by the webmaster, so your comment must be useful to the blog. For heaven’s sake, don’t just say “useful info – thanks”; that’s just taking the mickey. Be creative.

Some webmasters contact website administrators directly, asking for a backlink in exchange for a link from their own site. Although this method does work at times, it is not always reliable, because not every website administrator will maintain their site – and if the site goes, so does your backlink.

Beware, and remember ‘targeted’: another way to gain backlinks is through link exchanges. Through link exchanges, you get your links onto other people’s websites while you do the same for them by placing their links on your own website.

The disadvantage of this is that you may have a lot of links coming from unrelated websites which the search engines don’t really like – so I’d be careful with that.

You can also use press releases as a way of building backlinks. Press releases can offer the same effect as article submissions when used properly, but do bear in mind that a press release needs to be timely and newsworthy. The major difference from articles is that press releases need to be in the form of news flashes. When submitted to PR directories, they return a backlink to your website, and this is even better if the press release is picked up by other sites, as you’ll get a backlink from them as well.

So there you have it: some simple, no-cost (apart from your time) methods to get you started on your link-building strategies. Just try them out, and remember it’s not necessarily about volume – it’s about doing it consistently and regularly.

Post written by Toby Russell

Microsoft launches Office In The Cloud

Written By M. Anandhan Mudhaliyar on Friday, July 1, 2011 | 4:34 AM

Friday, July 1, 2011

Called Office 365, the service puts the familiar e-mail, word processing, spreadsheet and collaboration programs on the web.

Microsoft said the programs will be accessible via desktops, laptops and tablets plus Microsoft, RIM, Apple and Android smartphones.

The launch is aimed squarely at Google and others who already offer web-based business software.

Cash cow: Office 365 is being formally launched on 28 June via events in New York and London. The service unshackles the well-known programs from a single PC and translates them into a web format.

Charges for the service are based on the size of the business that wants to use it. Small businesses with fewer than 25 employees will pay £4 per user per month for secure access to e-mail, calendar, documents and contacts.

Larger organisations will pay from £6 to £17.75 per user, per month, and get a broader range of extras including advanced archiving, unlimited storage and Microsoft's Lync messaging and communications system. Customers using Office 365 can host the applications they are using in Microsoft's data centres, use dedicated servers in those centres, or put the programs on their own hardware in their own data centres.

Office 365 takes the place of Microsoft's current web-based offering for firms, known as the Business Productivity Online Standard Suite. Office 365 stands separate from the web versions of Office, which feature cut-down versions of the familiar programs.

The move to the cloud is seen as a gamble by Microsoft because much of the cash generated by Office comes from sales of software installed on desktop PCs. Switching to the web could dilute this cashflow which is responsible for about one-third of the company's revenue.

However, a web option is seen as essential in order to combat the growing threat from Google and others that are starting to poach Microsoft customers.

"Windows and Office are the two foundations of Microsoft's profitability and this is kind of messing with one of them," said Jeff Mann, a VP of research at analyst group Gartner. "It's definitely a very big bet."

Before the official launch of Office 365, Google put a post on its Enterprise Blog comparing its Apps service with Microsoft's offering.

Shan Sinha, Google Apps product manager, wrote that it was better to start with a new technology rather than add extras to an ageing one.

"Technology inevitably gets more complicated as it gets older," he wrote. "Upgrading platforms and adding features results in systems that are increasingly difficult to manage and complex to use."

In the blog post he runs through the differences between the two services, saying that Google Apps is about teams, the web and choice but by contrast Office 365 was for individuals, desktop PCs and other Microsoft-specific technology.

"You can't just take legacy, desktop software, move some of it to a data center and call it 'cloud'," he said. "Apps was born for the web and we've been serving hundreds of millions of users for years."

Other online business software suites are offered by companies including Zoho, VMware, IBM and others.

Source taken from

Local SEO with Google Places Page

Like most small business owners with their first website, I didn’t immediately understand the importance of keywords and search engine optimization when it comes to being found by customers who live virtually on your doorstep. And that’s where Local SEO is a key factor.

Statistics have shown that most customers will buy from businesses located within five miles of their base. It’s almost as if they trust a business more if it has a local physical address.

I now know that the key to getting on the front page of Google for local keywords is to rank in the seven-pack list of Google Places pages with one of those red balloons. These are the pages that Google is giving away for free to those businesses that it recognizes as being the ‘go to’ places for a particular service/product in that location.

Google makes the decision as to who gets into the top seven by the relevance of the website’s description to the search term, its proximity to the center of the location and how often they have seen that business mentioned in a variety of local listing directories/social media platforms. That’s what makes the difference between the award of a red balloon and a red dot.
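Just to illustrate the idea (this is a toy model of my own, absolutely not Google's real formula), you could imagine the three signals the paragraph mentions being combined something like this:

```python
import math

def places_score(relevance, miles_from_center, citation_count):
    """Toy illustration only -- not Google's actual ranking formula.
    Combines the three signals mentioned above: description relevance (0-1),
    proximity to the centre of the location, and how often the business
    appears in local directories and social platforms."""
    proximity = 1.0 / (1.0 + miles_from_center)       # closer = higher
    citations = math.log1p(citation_count)            # diminishing returns
    return relevance * proximity * citations
```

The point of the sketch is only the shape of the trade-off: a highly relevant business far from the town centre can still be outranked by a slightly less relevant one next door with more citations.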

However, as a small business owner with no knowledge of SEO, local or otherwise, I was not aware of the thinking behind the selection. I knew I had to have a Google Places page but, over the months, there was conflicting advice as to the best way to get one.

Some experts advised that an existing Google Places page should be claimed immediately to stop any mischievous competitor from taking it and messing around with the phone number or address whilst others suggested that it was better to get all the information needed to complete the various demands for company information and then claim it.

Still more said that, if Google had not already deemed you worthy by preparing a Places page for you for free, then you should just create your own and another group insisted that was wrong and you should promote your online presence until the big G had been nudged into noticing you.

The one thing that they all eventually agreed on was that you shouldn’t try to influence Google or any potential customers by stuffing keywords into the Title of your business listing. Sadly, it was too late for me because, by then, I had already submitted my entry with explanatory keywords after the main name of the business.

The next thing that became a consensus was that you needed to have an absolute address. PO Boxes and virtual addresses would not do at all since verification was now done via postcard and no proper address meant no possibility of proving that you did do business from that address. However, again, this became a fact after I had entered a second entry (with an incomplete address to preserve my anonymity as a home worker) because Google Places seemed to have eaten my first.

Two months later, I typed my keyword and location into the Google search box and discovered that both entries were now showing – and that neither had the correct details. To my consternation, one had a big red circle with a letter and the other the standard red lettered balloon.

That’s when I met someone who did know what they were doing and, under their guidance, we were able to start again and do things properly, in the right order, so that Google could present me with my own free website showing all the required information for local customers to get in touch.

Blogpost written by Ashley J Downs
Content Source Taken from

Start Google AdWords Easily for Your Websites

In this post I’ll cover the simple mechanics of Google AdWords so you can get up and running with an account in no time. I tell you, it’s really simple and easy to do, and it’s possibly one of the most important tools in your Internet marketing and information publishing arsenal.

Again, casting my mind back to when I was pounding the streets as an ad sales rep for a local newspaper, I can remember how those advertisers who were a bit savvy would look to book the primary positions – front and back page, or an early right-hand page, that kind of thing. In fact there was one guy who had a sports shop – I’d go and see him once a year, and he’d only book front- or back-page slots, called solus positions in those days because they were, as you’ve guessed, on their own.

I often wondered why: he always paid a premium, never tried to get a discount, and likewise would never take other positions even if they were ‘good discounted deals’. Then I suddenly realized what he was after was ‘targeted’ traffic – he only wanted those people interested in his merchandise to respond, no time-wasters – and by standing out on the page he got terrific exposure and a steady stream of people who were looking for sports gear. Clever guy; I tell you, if he were still around now he’d have been one of the first online.
Anyway, here’s how to get started with AdWords – and remember the word: ‘TARGETED’. Here we go.

How It Works

The way it works is that Google places your ads strategically on sites that are related to your product, or you can choose to appear on the search results pages. In other words, your ads appear on sites where your potential customers will be, and on results pages for the searches your customers are making when they look for answers and information.
Google AdWords is a very popular system bearing in mind Google’s premier position in the search engine hierarchy. When someone is looking for information online, they naturally go to Google. It makes sense that if you want to advertise to the largest number of people, you should use the world’s most popular website.

How Much It Costs
The price you pay for your ads depends on your choice of keywords and how competitive they are; the more popular and competitive a keyword is, the more its price will be driven up by that competition. In other words, if lots of other people are bidding on it, it’s going to be more expensive.

Effective Keyword Research is Vital

So again, your keyword research is absolutely key: you need to drill down to keywords closely associated with your main keyword/niche that are less competitive and therefore more affordable. When you decide on relevant keywords that you want to target, you simply bid on them and Google gives you a price. It sounds tricky, but it’s really not – it’s very simple. I know Sam and I thought, ‘Blimey, we’ll never understand all that’, but believe me, the tutorials are great and it’s simple.

How To Get Set Up On Google AdWords
The system is very easy to use. It has a clear and friendly interface, and it takes about 5 minutes to create an account. Once you’ve set up an account, write an ad using your selected keywords. Enter your bid and set the maximum you’re willing to pay. You can also decide whether the ad will be used only on Google’s search results page, or whether you’d like it to appear on other sites that use the Google AdWords system – sites that allow Google to place ads on them. Google places ads according to the keywords used, so that only relevant ads appear on each site.

You can also choose how you want the ads positioned on the page. You can pay more to have them placed higher, since people are more likely to see ads posted high on a webpage. Once all of this is done, you wait for Google to approve your ads.

Here’s the Really Cool Bit – Controlling Your Cost
Here’s the best thing: you set a daily budget that you are prepared to spend. A word of advice here – initially set it low, test it, get comfortable, see what the results are like, and try a couple of ad variations. This way you don’t lose your shirt by spending too much, and that is the real beauty of Google AdWords: it’s targeted traffic within a budget. You can instantly pause or stop your Google AdWords campaign by simply clicking a button in the account.
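As a quick sanity check on budgets, the arithmetic is simply your daily budget divided by your maximum cost-per-click; a tiny sketch (the figures are made-up examples):

```python
def max_daily_clicks(daily_budget, max_cpc):
    """Rough ceiling on clicks per day: once the daily budget is spent,
    the ads stop showing. Whole clicks only, so round down."""
    return int(daily_budget // max_cpc)
```

So a £5 daily budget at a 25p maximum bid caps you at about 20 clicks a day – handy for setting expectations before you turn a campaign on.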

If the whole process confuses you, there is no need to worry. In Google’s AdWords site area there are a number of easy-to-understand tutorials that walk you through the process and give you some tips on how to get the most out of your advertising campaign.

Pay per click advertising with Google AdWords is probably the Internet’s easiest and most effective form of advertising. In fact, it’s probably the easiest and most targeted system of advertising online or off. Not only does it get you great results, it’s also extremely flexible and easy to use. So there you have it, it’s an absolute must to have in your marketing arsenal, so go there today and get your account set up.

Blogpost Written by Toby Russell

Listing at Dmoz Is Not as Easy as You’d Like...

Written By M. Anandhan Mudhaliyar on Tuesday, June 28, 2011 | 11:09 PM

Tuesday, June 28, 2011

The open directory project, or Dmoz as it is better known, is the directory many webmasters want to be in. There are many myths and so-called benefits to being in Dmoz. Unfortunately, this article will not be about them. This article is to help people better understand how Dmoz works and will hopefully give you a better idea of how to submit properly. None of this article is speculation. This has all come from my conversations with the editors themselves, either from their message board or via email.
Here are the basics:

Keep in mind it can take anywhere from 2 weeks to a year or more to get listed. In order to be accepted, you need to meet their qualifications. Some of the qualifications are...

You can't submit a site that consists mostly of affiliate links.

You can't submit an individual URL from a site that is already listed – for example, if you submit your homepage, you can't also submit one of its inner pages.

You can't submit mirror sites. These are sites that are identical to another. (Same content, just different URL)

Do not submit a site that is "under construction" or only partially finished.

Your site can't have any illegal content, child pornography, or advocate illegal activity.

How soon can you expect your site to be accepted into the directory? It can take anywhere from a few weeks to a year or more. Why so long? Dmoz runs on the steam of volunteers. An editor for Dmoz must make an edit at least once every four months. So, if an editor does the minimum and you're number 1000 on the list in his or her category, it may take a while, to say the least. Some editors are more active than others, so their level of commitment can greatly affect the process.
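To put the volunteer maths in perspective, here's a back-of-envelope sketch (the numbers are only the worst case described above, not a prediction):

```python
def worst_case_wait_months(queue_position, edits_per_four_months=1):
    """Back-of-envelope estimate: if the category's editor makes only the
    minimum number of edits (one per four months by default), the queue
    clears at that rate."""
    edits_per_month = edits_per_four_months / 4.0
    return queue_position / edits_per_month
```

At the bare minimum rate, position 1000 in a queue works out to 4000 months – which is exactly why "submit it and forget it" is the editors' advice.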

According to Dmoz's forum, only 31% of sites are submitted to the right category. How can you get listed faster? Make sure you pick the right category. If you are not sure what your category is, look at other sites in the directory and see where your competition is listed. What happens if you pick the wrong category? It will slow your site's approval.

If an editor has the power to, he will reassign it. If not, it will be left un-reviewed until an editor with the power to move it does so. I also learned that if your site is not listed, 99% of the time it's because your site lacks quality content – or content, period.

If you go to submit to a category and you see "Volunteer to edit this category", that usually means there is no one assigned to the category you are applying to be in. That doesn't mean it won't be reviewed by someone higher up, but I am sure it will take even longer than usual. Oh, and if you are thinking about resubmitting every week, month, or the rumored 3 months – don't. I was told that if your site is in the queue to be evaluated, resubmitting may cause your site to go to the back of the line. That means you'll have to start from scratch again.

I know this seems like a long complicated process. The editors at Dmoz have told me that people should submit their site and forget it.

You can check out their forums and speak with the editors there, but don't expect them to tell you the status of your submission. They won't, and you will not get an email saying that you have been accepted or rejected. It seems like a lose-lose situation, but have patience. The best shot you have of getting listed faster is to be in a very non-competitive category, or one that doesn't have a lot of submissions or listings.

How To Avoid Viruses On Social Media by James Brack

Written By M. Anandhan Mudhaliyar on Saturday, June 25, 2011 | 5:18 AM

Saturday, June 25, 2011

Online thieves and scammers are increasingly using social media to scam and hustle people out of private information and even money. This is because people trust links posted by their friends – though it has now started to filter through to people that there is a spam danger even in links posted by friends.

Here are the Top 10 Basic Tips to Stay Spam and Malware Free on Social Networking Websites:


  • Be 100% sure that links are going to respectable websites by hovering over the link to check the web address.
  • You might receive emails asking for your log-in details – but don’t enter your Facebook login anywhere other than Facebook’s own site.
  • Use numbers and special characters in passwords, and don’t use dictionary words. It is also recommended to change your password regularly.
  • Ensure anti-virus software is installed. Run a scan every couple of days and keep it updated with the latest updates.
  • Remove cookies.
  • Don’t add people on Facebook that you don’t know – as scammers use fake profiles to spy on you. Unfollow people on Twitter if their tweets look “unhuman”.
  • Public Wi-Fi can be risky, as its security isn’t quite at the same level as a traditional connection. Don’t click on links in your inbox or in text messages you have received.
  • Videos that claim to show the body of Osama Bin Laden, or ‘My Ex-Girlfriend Cheated on me… Here is my revenge!’ posts, are spam. Don’t click on them; delete them immediately. There are lots of variants of these types of scams – so always do the “hover test” to see where the link goes.
  • Remove spam and report suspicious looking links/posts.
  • Browse security news sections and blogs regularly, just to keep up-to-date and aware of things to avoid.
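The "hover test" from the list above can even be automated; here's an illustrative sketch (my own heuristic, not a real security tool) that flags links whose visible text names one domain while the real destination is another:

```python
from urllib.parse import urlparse

def _host(s):
    """Extract a hostname, tolerating bare domains, and drop a 'www.' prefix."""
    h = urlparse(s if "://" in s else "//" + s, scheme="http").hostname or ""
    return h[4:] if h.startswith("www.") else h

def looks_suspicious(display_text, actual_href):
    """Programmatic 'hover test': flag a link whose visible text looks like
    a domain but whose real destination is a different one."""
    shown, real = _host(display_text), _host(actual_href)
    # Only treat the display text as a domain claim if it actually contains a dot
    return bool(shown) and "." in shown and shown != real
```

So a link displayed as `www.facebook.com` but pointing at another domain gets flagged, while ordinary "Click here" text is left alone.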

If you have clicked on a link that you think is malware, then immediately change your password, delete the post/uninstall the app, tell the social media website that you suspect that a post is spam, and run a full virus scan on your computer.

This guest post was written by James Brack

James Brack is a regular contributor to blogs on the topics of search engine optimization, social media security, and computer and mobile forensics.


Copyright © 2011. Latest SEO Tricks . All Rights Reserved.