CodeBus
www.codebus.net
Search - Search Engine - List
[AI-NN-PR] bot-package-1.4
DL : 0
Companion source code for "Java Web Robot Programming Guide", a book on building network robots with Web access capability. Starting from the basic principles of Internet programming, it explains step by step, in an accessible way, the implementation techniques behind the Spider, Bot, and Aggregator robot programs, and analyzes the strengths and suitable use cases of each. The book supplies a large amount of working source code with detailed analysis; with these techniques you can readily design and implement robot programs such as web spiders or web information searchers. (A minimal crawler sketch in Java follows this entry.)
Date : 2025-12-22  Size : 1.37mb  User : 黄
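
A minimal sketch of the kind of Spider loop such a robot program uses, not code from this package: it fetches a page, scrapes href links with a regex, and crawls breadth-first. It assumes Java 11+ (java.net.http.HttpClient); the seed URL and the ten-page budget are illustrative.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.*;
import java.util.regex.*;

// Minimal breadth-first Spider sketch: fetch a page, collect href links, repeat.
public class SpiderSketch {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        Queue<String> frontier = new ArrayDeque<>(List.of("https://example.com/")); // seed URL (illustrative)
        Set<String> visited = new HashSet<>();
        Pattern href = Pattern.compile("href=\"(http[^\"]+)\"");

        while (!frontier.isEmpty() && visited.size() < 10) {         // small page budget for the sketch
            String url = frontier.poll();
            if (!visited.add(url)) continue;                          // skip pages already fetched
            HttpResponse<String> resp = client.send(
                HttpRequest.newBuilder(URI.create(url)).GET().build(),
                HttpResponse.BodyHandlers.ofString());
            Matcher m = href.matcher(resp.body());
            while (m.find()) frontier.add(m.group(1));                // enqueue outgoing links
            System.out.println("Fetched " + url + " (" + resp.body().length() + " chars)");
        }
    }
}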
[AI-NN-PR] coffeecc06
DL : 0
Java Chinese chess (Xiangqi) source program: artificial intelligence, search engines.
Date : 2025-12-22  Size : 236kb  User : wuxc
[AI-NN-PR] carrot2-demo-webapp-2.1.3
DL : 1
Clustering source code from the carrot2 site, an internationally recognized clustering search engine.
Date : 2025-12-22  Size : 10.53mb  User : chehao
[AI-NN-PR] softwarecode
DL : 0
Chinese word segmentation is a key step in Chinese information processing, and segmentation techniques are widely used in machine translation, text retrieval, speech recognition, text proofreading, artificial intelligence, and search engine technology. The choice of segmentation algorithm, the way the Chinese dictionary is built, and the completeness of its entries largely determine the performance of a Chinese word segmentation system. (A dictionary-based segmentation sketch follows this entry.)
Date : 2025-12-22  Size : 890kb  User : 李东升
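
The description does not name a specific algorithm; as one common dictionary-based approach, here is a forward-maximum-matching sketch in Java. The toy dictionary and input text are illustrative, not taken from this package.

import java.util.*;

// Forward maximum matching: at each position, take the longest dictionary word that matches.
public class FmmSegmenter {
    public static List<String> segment(String text, Set<String> dict, int maxLen) {
        List<String> tokens = new ArrayList<>();
        int i = 0;
        while (i < text.length()) {
            int end = Math.min(text.length(), i + maxLen);
            String word = null;
            for (int j = end; j > i; j--) {                    // try the longest candidate first
                String cand = text.substring(i, j);
                if (dict.contains(cand)) { word = cand; break; }
            }
            if (word == null) word = text.substring(i, i + 1); // unknown character: emit it alone
            tokens.add(word);
            i += word.length();
        }
        return tokens;
    }

    public static void main(String[] args) {
        Set<String> dict = Set.of("中文", "分词", "搜索", "引擎", "搜索引擎"); // toy dictionary (illustrative)
        System.out.println(segment("中文分词搜索引擎", dict, 4));             // [中文, 分词, 搜索引擎]
    }
}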
[AI-NN-PR] 41
DL : 0
In this paper we introduce a fuzzy-based engine for the scheduling problem. We assume that a human planner is unable to determine the weights for a multi-objective system but is only able to compare pairs of objectives using a limited vocabulary. We convert this vocabulary to fuzzy terminology and apply a fuzzy-based optimum search for a solution to the scheduling problem. We demonstrate our approach on a real-world scheduling system. (A toy sketch of the vocabulary-to-weights step follows this entry.)
Date : 2025-12-22  Size : 3.3mb  User : Antony Nelson
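
The paper's exact fuzzy model is not reproduced here; the following toy Java sketch only illustrates the general idea of turning pairwise linguistic comparisons between objectives into normalized weights. The vocabulary terms, membership degrees, and objective names are all assumptions made for illustration.

import java.util.*;

// Toy sketch: map a limited comparison vocabulary to fuzzy preference degrees,
// accumulate them per objective, and normalize into weights.
public class FuzzyWeights {
    static final Map<String, Double> VOCAB = Map.of(    // membership values are illustrative
        "much more important", 0.9,
        "more important", 0.7,
        "equally important", 0.5);

    public static void main(String[] args) {
        // Planner statements: objective A vs objective B, expressed with a vocabulary term.
        String[][] comparisons = {
            {"makespan", "cost", "more important"},
            {"makespan", "lateness", "much more important"},
            {"cost", "lateness", "equally important"}};

        Map<String, Double> score = new HashMap<>();
        for (String[] c : comparisons) {
            double d = VOCAB.get(c[2]);
            score.merge(c[0], d, Double::sum);           // preferred objective gets the degree
            score.merge(c[1], 1.0 - d, Double::sum);     // the other gets the complement
        }
        double total = score.values().stream().mapToDouble(Double::doubleValue).sum();
        score.forEach((obj, s) -> System.out.printf("%s -> weight %.2f%n", obj, s / total));
    }
}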
[AI-NN-PR] Topic-oriented-meta-search-engine
DL : 0
Keywords: topic-oriented, meta-search engine, neural network, relevance, Chinese word segmentation.
Date : 2025-12-22  Size : 58kb  User : 邵延琦
[AI-NN-PR] Inverted-Indexing-for-Text-retrieval
DL : 0
Web search is the quintessential large-data problem. Given an information need expressed as a short query of a few terms, the system's task is to retrieve relevant web objects (web pages, PDF documents, PowerPoint slides, etc.) and present them to the user. How large is the web? It is difficult to compute exactly, but even a conservative estimate would place the size at several tens of billions of pages, totaling hundreds of terabytes (considering text alone). In real-world applications, users demand results from a search engine quickly: query latencies longer than a few hundred milliseconds will try a user's patience. Fulfilling these requirements is quite an engineering feat, considering the amounts of data involved! (A minimal inverted-index sketch follows this entry.)
Date : 2025-12-22  Size : 584kb  User : mhk
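
As a companion to the text, a minimal in-memory inverted index in Java, mapping each term to the sorted set of document ids containing it. The three sample documents are illustrative; a production index would add real tokenization, compression, and on-disk postings.

import java.util.*;

// Minimal inverted index: map each term to the sorted set of document ids containing it.
public class InvertedIndex {
    public static void main(String[] args) {
        String[] docs = {                                   // sample documents (illustrative)
            "web search is a large data problem",
            "an inverted index maps terms to documents",
            "search engines answer a query in milliseconds"};

        Map<String, TreeSet<Integer>> index = new TreeMap<>();
        for (int id = 0; id < docs.length; id++) {
            for (String term : docs[id].toLowerCase().split("\\s+")) {
                index.computeIfAbsent(term, t -> new TreeSet<>()).add(id); // postings list per term
            }
        }
        System.out.println("search -> " + index.get("search"));   // [0, 2]
        System.out.println("index  -> " + index.get("index"));    // [1]
    }
}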
[AI-NN-PR] parser-cPP
DL : 0
This is an implementation of a web crawler algorithm. The web crawler is the core component of a search engine: Google and Baidu each have their own crawler algorithms, and good crawler technology is key to implementing this functionality efficiently. (A link-parsing sketch follows this entry.)
Date : 2025-12-22  Size : 5kb  User : 张明
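
A sketch of the parsing step of a crawler, written in Java rather than the C++ of parser-cPP so that all examples here share one language: it pulls href values out of HTML with a regex and resolves relative links against the page's base URL. The sample HTML and URLs are illustrative, not the package's actual code.

import java.net.URI;
import java.util.*;
import java.util.regex.*;

// Sketch of a crawler's parsing step: extract href values from HTML
// and resolve relative links against the page's base URL.
public class LinkParser {
    private static final Pattern HREF = Pattern.compile("href\\s*=\\s*\"([^\"]+)\"");

    public static List<String> extractLinks(String baseUrl, String html) {
        URI base = URI.create(baseUrl);
        List<String> links = new ArrayList<>();
        Matcher m = HREF.matcher(html);
        while (m.find()) {
            links.add(base.resolve(m.group(1)).toString()); // relative links become absolute
        }
        return links;
    }

    public static void main(String[] args) {
        String html = "<a href=\"/docs/a.html\">A</a> <a href=\"https://example.org/b\">B</a>"; // sample HTML
        System.out.println(extractLinks("https://example.com/index.html", html));
        // [https://example.com/docs/a.html, https://example.org/b]
    }
}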