The Internet may have started as the fervent brainchild of DARPA, the US defense agency - but it quickly evolved into a network of computers at the service of a community. Academics around the world used it to communicate, compare results, compute, interact and flame each other. The ethos of the community as content creator, source of information, fount of emotional sustenance, peer group, and social substitute is embedded in the very fabric of the Net. Millions of members of free, advertising- or subscription-financed mega-sites such as Geocities, AOL, Yahoo and Tripod generate more bits and bytes than the rest of the Internet combined. This traffic emanates from discussion groups, announcement (mailing) lists, newsgroups, and content sites (such as Suite101 and Webseed). Even the occasional visitor can find priceless gems of knowledge and opinion in the mound of trash and frivolity that these parts of the web have become.
The emergence of search engines and directories which cater only to this (sizeable) market segment was to be expected. By far the most comprehensive (and, thus, least discriminating) was Deja. It spidered and took in the exploding newsgroups (Usenet) scene, with its tens of thousands of daily messages. When it was taken over by Google, its archives contained more than 500 million messages, cross-indexed every which way and pertaining to every possible (and many an impossible) topic.
Google is by far the most popular search engine yet, having surpassed the more veteran Northern Light, Fast, and AltaVista. Its mind-defying database (more than 1.3 billion web pages), its caching technology (making it, in effect, one of the biggest libraries on earth) and its site ranking (by popularity and inbound links) have rendered it unbeatable. Yet its efforts to integrate the treasure trove that is Deja and adapt it to the Google search interface were hitherto spectacularly unsuccessful (though it finally made it, two and a half months after the purchase). So much so, that they gave birth to a protest movement.
Bickering and bad-tempered flaming (often bordering on the deranged, the racist, or the stalker-like) are the more repulsive aspects of the Usenet groups. But at the heart of the debate this time is no ordinary sadistic venting. The issue is: who owns content generated by the public at large on computers funded by tax dollars? Can a commercial enterprise own and monopolize the fruits of the collective effort of millions of individuals from all over the world? Or should such intellectual property remain in the public domain, perhaps maintained by public institutions (such as the Library of Congress)? Should open source movements gain access to Deja's source code in order to launch Deja II? And who owns the copyright to all these messages (theoretically, the authors)? Google, as Deja before it, is offering compilations of this content, the copyright to which it does not and cannot own. The very legal concept of intellectual property is at the crux of this virtual conflict.
Google was, thus, compelled to offer free access to the CONTENT of the Deja archives to alternative (non-Google) archiving systems. But it remains mum on the search programming code and the user interface. Already one such open source group (called Dela News) is coalescing, although it is not clear who will bear the costs of the gigantic storage and processing such a project would require. Dela wants to have a physical copy of the archive deposited in trust with a dot org.
This raises a host of no less fascinating subjects. The Deja Usenet search technology, programming code, and systems are inextricable and almost indistinguishable from the Usenet archive itself. Without these elements - structural as well as dynamic - there will be no archive and no way to extract meaningful information from the chaotic bedlam that is the Usenet environment. In this case, the information lies in the ordering and classification of raw data and not in the content itself. This is why the open source proponents demand that Google share both content and the tools to access it. Google's hasty and improvised unplugging of Deja in February only served to aggravate the die-hard fans of erstwhile Deja.
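The point that "the information lies in the ordering and classification of raw data" can be made concrete. A minimal sketch of an inverted index - not Deja's actual code, which was never released, and using invented sample messages - shows how the index, rather than the raw message bodies, is what makes half a billion postings searchable:

```python
from collections import defaultdict

def build_index(messages):
    """messages: dict mapping a Message-ID to its body text.
    Returns an inverted index: term -> set of Message-IDs."""
    index = defaultdict(set)
    for msg_id, body in messages.items():
        for term in body.lower().split():
            index[term].add(msg_id)
    return index

def search(index, term):
    """Return the sorted IDs of messages containing the term."""
    return sorted(index.get(term.lower(), set()))

# Hypothetical sample messages standing in for Usenet postings.
messages = {
    "<1@example>": "Open source archives of Usenet",
    "<2@example>": "Google bought the Deja archive",
    "<3@example>": "The archive belongs in the public domain",
}
index = build_index(messages)
print(search(index, "archive"))  # -> ['<2@example>', '<3@example>']
```

Without such a structure, every query would mean rereading the entire chaotic corpus - which is why the open source camp insists on the tools, not just the content.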
The Usenet is not only the refuge of pedophiles and neo-Nazis. It includes thousands of academically rigorous and research-inclined discussion groups which morph with intellectual trends and fashionable subjects. More than twenty years of wisdom and erudition are buried in servers all over the world. Scholars often visit Usenet in their pursuit of complementary knowledge or expert advice. The Usenet is also the documentation of Western intellectual history in the last three decades. It is invaluable. Google's decision to abandon the internal links between Deja messages means the disintegration of the hyperlinked fabric of this resource - unless Google comes up with an alternative (and expensive) solution.
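That "hyperlinked fabric" is not metaphorical: every Usenet article (per the RFC 1036 message format) carries a Message-ID and a References header naming its ancestors, from which whole discussion threads can be rebuilt. A hedged sketch, with invented sample articles, of what reconstructing those links involves:

```python
from collections import defaultdict

def build_threads(articles):
    """articles: list of dicts with 'id' (Message-ID) and 'references'
    (list of ancestor Message-IDs, oldest first, as in the References header).
    Returns a mapping: parent Message-ID -> list of direct reply IDs;
    top-level posts are filed under the key None."""
    replies = defaultdict(list)
    for art in articles:
        parent = art["references"][-1] if art["references"] else None
        replies[parent].append(art["id"])
    return replies

# Hypothetical three-message thread: a root post and two follow-ups.
articles = [
    {"id": "<root@a>", "references": []},
    {"id": "<re1@b>",  "references": ["<root@a>"]},
    {"id": "<re2@c>",  "references": ["<root@a>", "<re1@b>"]},
]
threads = build_threads(articles)
print(threads[None])        # -> ['<root@a>']
print(threads["<root@a>"])  # -> ['<re1@b>']
print(threads["<re1@b>"])   # -> ['<re2@c>']
```

Severing Deja's internal links is the equivalent of discarding these parent-child relations: the messages survive, but the conversations do not.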
Google is offering a better, faster, more multi-layered and multi-faceted access to the entire archive. But its brush with the more abrasive side of the open source movement brought to the surface long suppressed issues. This may be the single most important contribution of this otherwise not so opportune transaction.
About The Author
Sam Vaknin is the author of "Malignant Self Love - Narcissism Revisited" and "After the Rain - How the West Lost the East". He is a columnist in "Central Europe Review", United Press International (UPI) and ebookweb.org and the editor of mental health and Central East Europe categories in The Open Directory, Suite101 and searcheurope.com. Until recently, he served as the Economic Advisor to the Government of Macedonia.
His web site: http://samvak.tripod.com