The web has become the medium of choice for people who search for information, conduct business, and enjoy entertainment. The Internet grows and evolves every day, and more and more people are joining the so-called Internet community.
At the same time, the web has also become the primary platform used by miscreants to attack users. Malicious websites are a cornerstone of Internet criminal activity, and online criminals who seek to destroy, cheat, defraud, or steal are evolving rapidly.
So what options can aid us in recognizing malicious websites?
Many options exist, but they are not freely available for further research, and only limited access to their results is publicly provided.
One technology currently in use is honeypot-based detection. Honeypots are closely monitored decoys deployed in a network to study the behavior of attackers and to alert network administrators to a possible intrusion. They provide a cost-effective way to improve an organization's security posture. The first honeypot, the Deception Toolkit, was released in 1997.
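The core idea of a honeypot, a decoy service that records every connection attempt so an operator can be alerted, can be illustrated with a toy sketch. This is not a production honeypot; the port choice, logging format, and single-connection limit are all simplifying assumptions made for the example.

```python
import socket
import threading

def run_honeypot(host="127.0.0.1", port=0, max_conns=1):
    """Listen on a decoy port and record every connection attempt.

    Returns (bound_port, log), where log fills with the (ip, port)
    address of each client that connects. A real honeypot would also
    capture the attacker's traffic and raise an alert.
    """
    log = []
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))  # port=0 lets the OS pick a free port
    srv.listen()
    bound_port = srv.getsockname()[1]

    def serve():
        for _ in range(max_conns):
            conn, addr = srv.accept()
            log.append(addr)  # record the intruder's address
            conn.close()
        srv.close()

    threading.Thread(target=serve, daemon=True).start()
    return bound_port, log
```

Because no legitimate service runs on the decoy port, any connection that appears in `log` is by definition suspicious, which is precisely what makes honeypots such low-noise sensors.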
Web crawlers, also called robots or web spiders, are another technique used to identify harmful websites. A web crawler is a program that, given one or more seed URLs, downloads the web pages associated with those URLs, extracts any hyperlinks contained in them, and recursively continues to download the pages identified by those hyperlinks. Web crawlers are an important component of web search engines, where they collect the corpus of pages indexed by the engine. They are almost as old as the web itself: in the spring of 1993, just months after the release of NCSA Mosaic, Matthew Gray wrote the first web crawler, the World Wide Web Wanderer, which was used from 1993 to 1996 to compile statistics about the growth of the web.
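The seed-download-extract-recurse loop described above can be sketched in a few lines. To keep the example self-contained, a small in-memory dictionary of pages stands in for real HTTP fetches; the URLs and page bodies are invented for illustration.

```python
from collections import deque
from html.parser import HTMLParser

# Hypothetical in-memory "web": URL -> HTML body. A real crawler would
# fetch these pages over HTTP instead of looking them up here.
PAGES = {
    "http://a.example": '<a href="http://b.example">b</a>',
    "http://b.example": '<a href="http://a.example">a</a>'
                        '<a href="http://c.example">c</a>',
    "http://c.example": "no links here",
}

class LinkExtractor(HTMLParser):
    """Collect the href attribute of every <a> tag encountered."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seeds):
    """Breadth-first crawl: download each URL, extract its hyperlinks,
    and enqueue any URL that has not been visited yet."""
    visited = set()
    frontier = deque(seeds)
    while frontier:
        url = frontier.popleft()
        if url in visited:
            continue
        visited.add(url)
        body = PAGES.get(url)  # stand-in for an HTTP GET
        if body is None:
            continue
        parser = LinkExtractor()
        parser.feed(body)
        frontier.extend(parser.links)
    return visited
```

The `visited` set is what keeps the recursion finite: pages on the real web link to each other in cycles, so a crawler that did not remember where it had been would loop forever.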
Services such as HoneyMonkey and SiteAdvisor can be used to find malicious websites.