Title: Crawler_src_code Download
Description: A web crawler (also known as a web spider or ant) is a program that browses the World Wide Web in a methodical, automated manner. Web crawlers are mainly used to create a copy of all visited pages for later processing by a search engine, which indexes the downloaded pages to provide fast searches. Crawlers can also automate maintenance tasks on a web site, such as checking links or validating HTML code, and they can gather specific types of information from web pages, such as e-mail addresses (usually harvested for spam).
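
The description above amounts to a simple breadth-first traversal: keep a queue of URLs to visit and a set of URLs already fetched, download each page, and enqueue the links it contains. The sketch below illustrates that loop in C# (the language of this archive's sources). It is not the archive's own code; the seed URL, the page budget, and the regex-based link extraction are assumptions made for the example.

    // Minimal breadth-first crawler sketch; illustrative only,
    // not the code contained in this download.
    using System;
    using System.Collections.Generic;
    using System.Net.Http;
    using System.Text.RegularExpressions;
    using System.Threading.Tasks;

    class MiniCrawler
    {
        static async Task Main()
        {
            var http = new HttpClient();
            var frontier = new Queue<Uri>();     // URLs waiting to be fetched
            var visited = new HashSet<string>(); // URLs already fetched

            frontier.Enqueue(new Uri("https://example.com/")); // hypothetical seed
            const int maxPages = 20;                           // crawl budget

            while (frontier.Count > 0 && visited.Count < maxPages)
            {
                Uri url = frontier.Dequeue();
                if (!visited.Add(url.AbsoluteUri)) continue;  // skip duplicates

                string html;
                try { html = await http.GetStringAsync(url); }
                catch (HttpRequestException) { continue; }    // dead link: skip it

                Console.WriteLine($"Fetched {url} ({html.Length} bytes)");

                // Pull href targets out with a simple regex; a real HTML
                // parser would be more robust in production code.
                foreach (Match m in Regex.Matches(html,
                         "href\\s*=\\s*\"([^\"]+)\"", RegexOptions.IgnoreCase))
                {
                    if (Uri.TryCreate(url, m.Groups[1].Value, out var link)
                        && (link.Scheme == "http" || link.Scheme == "https"))
                        frontier.Enqueue(link);
                }
            }
        }
    }

A production crawler would also honor robots.txt, throttle requests per host, and track per-link failures for the link-checking use mentioned above, but the queue-plus-visited-set structure stays the same.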
File list (check whether it contains the files you need):
AboutForm.cs
AboutForm.resx
AssemblyInfo.cs
BrowseForFolder.cs
Crawler.csproj
Crawler.csproj.user
Crawler.sln
FileTypeForm.cs
FileTypeForm.resx
MainForm.cs
MainForm.resx
res
...\App.ico
...\Mem1.ICO
...\Mem10.ICO
...\Mem11.ICO
...\Mem2.ICO
...\Mem3.ICO
...\Mem4.ICO
...\Mem5.ICO
...\Mem6.ICO
...\Mem7.ICO
...\Mem8.ICO
...\Mem9.ICO
...\Pause.ICO
...\Proxy.ICO
...\Request.ico
...\Run.ICO
...\Stop.ICO
...\Thread.ico
...\TOOLS.ICO
...\TRFFC09.ICO
...\TRFFC10A.ICO
...\TRFFC10B.ICO
...\TRFFC10C.ICO
rfc766.txt.pdf
Settings.cs
SettingsForm.cs
SettingsForm.resx
SortTree.cs