Posted to common-commits@hadoop.apache.org by Apache Wiki <wi...@apache.org> on 2009/05/20 17:19:21 UTC

[Hadoop Wiki] Update of "PoweredBy" by RussGarrett

Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The following page has been changed by RussGarrett:
http://wiki.apache.org/hadoop/PoweredBy

The comment on the change is:
Update last.fm stats, fix broken alphabetical ordering...

------------------------------------------------------------------------------
    * 3 X 20 machine cluster (8 cores/machine, 2TB/machine storage)
    * 10 machine cluster (8 cores/machine, 1TB/machine storage)
   * Used for log analysis, data mining, and machine learning
+ 
+  * [http://www.google.com Google]
+   * [http://www.google.com/intl/en/press/pressrel/20071008_ibm_univ.html University Initiative to Address Internet-Scale Computing Challenges]
+ 
+  * [http://www.gruter.com Gruter. Corp.]
+   * 30 machine cluster (4 cores, 1TB~2TB/machine storage)
+   * Storage for blog data and web documents
+   * Used for data indexing by MapReduce
+   * Link analysis and machine learning by MapReduce
  
   * [http://www.hadoop.co.kr/ Hadoop Korean User Group], a Korean Local Community Team Page.
   * 50 node cluster in the Korea university network environment.
@@ -93, +102 @@

    * 13 machine cluster (8 cores/machine, 4TB/machine)
    * Log storage and analysis
   * HBase hosting
- 
-  * [http://www.google.com Google]
-   * [http://www.google.com/intl/en/press/pressrel/20071008_ibm_univ.html University Initiative to Address Internet-Scale Computing Challenges]
- 
-  * [http://www.gruter.com Gruter. Corp.]
-   * 30 machine cluster (4 cores, 1TB~2TB/machine storage)
-   * Storage for blog data and web documents
-   * Used for data indexing by MapReduce
-   * Link analysis and machine learning by MapReduce
- 
  
   * [http://www.hadoop.tw/ Hadoop Taiwan User Group]
  
@@ -155, +154 @@

    * Source code search engine uses Hadoop and Nutch.
  
   * [http://www.last.fm Last.fm]
-   * 50 nodes (dual xeon LV 2GHz, 4GB RAM, 1TB/node storage and dual xeon L5320 1.86GHz, 8GB RAM, 3TB/node storage).
+   * 27 nodes
+   * Dual quad-core Xeon L5520 (Nehalem) @ 2.27GHz, 16GB RAM, 4TB/node storage.
   * Used for chart calculation, log analysis, and A/B testing
  
   * [http://www.lookery.com Lookery]