Argelius
Joined: 19 Jul 2010
Total Posts: 4047

14 Aug 2014 08:06 PM
I'm assuming they have a database full of websites that Googlebot has crawled. My question is about this:
"About 1,050,000,000 results (0.35 seconds)"
How does that speed compare to an average SQL database? It looks ridiculously fast...
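To make the question concrete, here is a toy sketch of the kind of inverted index search engines are generally built on (the names and data are made up, not Google's actual design): instead of scanning pages at query time, the index maps each word to the documents containing it, so answering a query is just a few lookups and a set intersection.

```python
# Toy inverted index: maps each term to the set of document IDs containing it.
# A sketch of the idea only -- a real index is sharded across many machines.
from collections import defaultdict

docs = {
    1: "the quick brown fox",
    2: "the lazy dog",
    3: "quick dog tricks",
}

# Build the index once, ahead of query time.
index = defaultdict(set)
for doc_id, text in docs.items():
    for term in text.split():
        index[term].add(doc_id)

def search(query):
    """Return IDs of docs containing every query term (AND semantics)."""
    term_sets = [index.get(term, set()) for term in query.split()]
    return set.intersection(*term_sets) if term_sets else set()

print(sorted(search("quick dog")))  # -> [3]
```

Because the per-term sets are precomputed, query cost depends on the size of the matching sets, not on the total number of pages crawled, which is why sub-second results over billions of documents are plausible.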

14 Aug 2014 08:17 PM
You need a good nonfiction book covering the internet in general, packet switching, and Google's algorithms. If only I remembered the title of the book...
Oh, also, you should learn about two-step checksums for error detection.
PageRank is what you should look up, but that's nowhere near as fun as a nonfiction book. :C
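By "two-step checksums" I'm guessing at something like Fletcher's checksum, which keeps two running sums instead of one, so it catches reordered bytes as well as flipped ones (a minimal sketch, not tied to any particular protocol):

```python
def fletcher16(data: bytes) -> int:
    """Fletcher-16: two running sums mod 255. The second sum makes the
    checksum sensitive to byte order, unlike a plain single sum."""
    sum1 = sum2 = 0
    for byte in data:
        sum1 = (sum1 + byte) % 255
        sum2 = (sum2 + sum1) % 255
    return (sum2 << 8) | sum1

# Swapping two bytes changes the result -- a plain single sum would miss it.
print(hex(fletcher16(b"abcde")))  # -> 0xc8f0
```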

14 Aug 2014 08:24 PM
And maybe you should learn about public-key cryptography. And how compression algorithms work.

Argelius
14 Aug 2014 09:05 PM
Actually I was looking for http://research.google.com/archive/bigtable.html

14 Aug 2014 09:22 PM
I have no idea what that has to do with PageRank, nonfiction books, or a Google search.

Argelius
15 Aug 2014 07:27 AM
I think it's the DBMS behind the database(s) Google actually uses when matching the keywords in a search against the actual content of web pages.
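For what it's worth, the paper describes BigTable as a sorted map from (row key, column, timestamp) to a value, with web pages keyed by reversed hostname so pages from the same domain sort next to each other. A toy model of that idea (a plain dict standing in for the distributed sorted map; the URLs and values are just illustrative):

```python
# Toy model of BigTable's data model: (row_key, column, timestamp) -> value.
# Row keys reverse the hostname, as in the paper's Webtable example, so all
# pages from one domain cluster together in the sorted key space.

def row_key(url: str) -> str:
    host, _, path = url.partition("/")
    return ".".join(reversed(host.split("."))) + "/" + path

table = {}

def put(url, column, timestamp, value):
    table[(row_key(url), column, timestamp)] = value

put("www.cnn.com/index.html", "contents:", 1, "<html>...</html>")
put("maps.google.com/", "anchor:cnn.com", 2, "CNN homepage")

print(row_key("www.cnn.com/index.html"))  # -> com.cnn.www/index.html
```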

15 Aug 2014 09:20 AM
Yeah, PageRank is what does it, and it searches over the actual results. I'm guessing they used self-learning algorithms to teach it to reject spam and such.
The random surfer model does a good job overall, right?
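The core of PageRank is simple enough to sketch: a page's rank is the chance that the random surfer lands on it, found by iterating until the scores settle. A toy version on a made-up three-page graph (no spam handling, and real deployments distribute this over billions of pages):

```python
def pagerank(links, damping=0.85, iterations=50):
    """Power iteration for PageRank on a dict {page: [pages it links to]}.
    `damping` is the probability the random surfer follows a link rather
    than jumping to a uniformly random page."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Everyone gets the "random jump" share up front.
        new = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
            else:  # dangling page: spread its rank over all pages
                for p in pages:
                    new[p] += damping * rank[page] / n
        rank = new
    return rank

graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
# "C" collects links from both A and B, so it ends up with the highest rank.
```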