Search Engine ranking optimization is how to sort Web pages
- 2018-10-31-

Search engine ranking optimization: how web pages are sorted

The search engines we are most familiar with include Baidu, Bing, and Google. Setting aside Baidu's "pay for ranking" practice, the most basic and central sorting algorithm in search engine ranking optimization is Google's PageRank algorithm, invented by Google's founders Larry Page and Sergey Brin. Around 1998 this invention produced a qualitative leap in the relevance of search results; it is hardly an exaggeration to say that the success of Google as a company was built on this algorithm.


So what exactly is PageRank? Put simply, it is a democratic vote. On the Internet, if a web page is linked to by many other pages, that indicates it is widely recognized and trusted, so it ranks high. For example, among pages relevant to the same keyword, a page linked to by 100 other pages can be thought of as having received 100 votes, and it will rank higher than a page linked to by only 10 pages, i.e. one with only 10 votes.
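The "one link, one vote" idea above can be sketched in a few lines of Python. The link graph here is a made-up example, not real data:

```python
# Each page maps to the pages it links out to (a hypothetical graph).
link_graph = {
    "page_a": ["page_b", "page_c"],
    "page_b": ["page_c"],
    "page_c": ["page_a"],
    "page_d": ["page_c"],
}

def count_votes(graph):
    """Count inbound links: every link to a page is one vote for it."""
    votes = {page: 0 for page in graph}
    for source, targets in graph.items():
        for target in targets:
            votes[target] += 1
    return votes

votes = count_votes(link_graph)
# page_c is linked by page_a, page_b, and page_d: 3 votes, so it ranks first
ranking = sorted(votes, key=votes.get, reverse=True)
```

Here `page_c` collects three votes while the others collect at most one, so it tops the ranking.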



But this raises a question: as "voters", different pages do not carry the same authority. Suppose one page is linked to by 100 spam sites while another is linked to by 50 authoritative sites; clearly the latter's 50 votes are more trustworthy than the former's 100. To solve this, each page is assigned a "weight" according to its reliability: the more reliable the page, the greater its weight, and computing with these weights avoids the problem. Say the authoritative sites have weight 5 and the spam sites weight 1. Then the first page scores 1 x 100 = 100 while the second scores 5 x 50 = 250, so the second ranks higher. This result is clearly more reasonable.
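The weighted-vote arithmetic from the paragraph above can be written out directly. The weights (1 for spam, 5 for authoritative) are the illustrative values from the text, not real measurements:

```python
def weighted_score(inbound_links):
    """Sum each inbound link's vote scaled by the linking site's weight.

    inbound_links: list of (link_count, weight) pairs.
    """
    return sum(count * weight for count, weight in inbound_links)

# 100 links from spam sites of weight 1 vs 50 links from sites of weight 5
spam_linked_page = weighted_score([(100, 1)])       # 1 x 100 = 100
authority_linked_page = weighted_score([(50, 5)])   # 5 x 50 = 250
# The page backed by authoritative sites scores higher and ranks first.
```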


However, one problem still needs to be solved: how are the "weights" themselves determined? Page and Brin turned this into a two-dimensional matrix multiplication problem. They begin by assuming all pages have the same weight, and proved that repeatedly iterating the matrix multiplication converges to the true values. In practice the number of pages is huge: a matrix over one billion pages would have a billion billion (10^18) elements, a computation far beyond a direct approach, so Page and Brin used sparse-matrix techniques to simplify the calculation. Readers interested in the specific mathematics can study it on their own.
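The iterative idea can be sketched as plain power iteration on a tiny graph: start every page with an equal weight, repeatedly redistribute each page's weight along its outgoing links, and stop when the values settle. This is a toy dense version with a conventional damping factor of 0.85; a real engine would use sparse-matrix storage as the text notes:

```python
def pagerank(graph, damping=0.85, iterations=50):
    """Toy power-iteration PageRank over a dict {page: [linked pages]}."""
    pages = list(graph)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # equal weights to start
    for _ in range(iterations):
        # every page keeps a small base share, the rest flows along links
        new_rank = {p: (1 - damping) / n for p in pages}
        for source, targets in graph.items():
            if not targets:
                continue
            share = damping * rank[source] / len(targets)
            for target in targets:
                new_rank[target] += share
        rank = new_rank
    return rank

# A made-up four-page web: "c" is linked by three of the four pages.
graph = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}
rank = pagerank(graph)
# "c" ends up with the highest rank; "d", linked by nobody, the lowest.
```

Even though every page starts equal, the iteration drives the well-linked page to the top, which is exactly the convergence property the text describes.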


Today's search engines are far more complete and complex, but the PageRank algorithm remains critical.