Crawling is performed by a program commonly called a spider or bot (Googlebot, in Google's case). It visits web pages, checks that the site is accessible and in order, and if it finds duplicate content it may drop that page or site from consideration. The search engine crawler visits each web page, identifies all the hyperlinks on the page, and adds them to the list of places to crawl.
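As a rough illustration of that visit-extract-queue loop, here is a minimal breadth-first crawler sketch in Python. It is a simplification, not how a production search engine works: the seed_url, the max_pages cap, and the LinkExtractor helper are all illustrative assumptions.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_url, max_pages=10):
    """Breadth-first crawl: fetch a page, extract its hyperlinks,
    and queue any URL not seen before."""
    frontier = deque([seed_url])   # the list of places left to crawl
    visited = set()                # skip duplicates instead of re-fetching
    while frontier and len(visited) < max_pages:
        url = frontier.popleft()
        if url in visited:
            continue
        visited.add(url)
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "replace")
        except OSError:
            continue               # unreachable page: move on
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            frontier.append(urljoin(url, link))  # resolve relative URLs
    return visited
```

The visited set plays the deduplication role described above: a page already seen is skipped rather than crawled again.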
Indexing is a simple file-management technique that organizes data into special folders, much like the labels in a filing cabinet. Internet search engines use specialized indexing techniques that store metadata about websites and their content.
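To make the filing-cabinet analogy concrete, the sketch below builds a toy inverted index: each word acts like a folder label pointing at the pages that contain it. The docs corpus and the whitespace tokenization are simplified assumptions, not a real search engine's metadata store.

```python
from collections import defaultdict

# Toy corpus: URL -> page text (illustrative stand-ins for crawled pages).
docs = {
    "https://example.com/a": "search engines crawl and index pages",
    "https://example.com/b": "an index maps each word to its pages",
}

# Inverted index: word -> set of URLs, like labeled folders in a cabinet.
index = defaultdict(set)
for url, text in docs.items():
    for word in text.lower().split():
        index[word].add(url)

# Lookup: which pages mention "index"?
print(sorted(index["index"]))
```

A query then becomes a simple folder lookup instead of a scan of every stored page, which is what makes searching the index fast.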