Posted to common-commits@hadoop.apache.org by Apache Wiki <wi...@apache.org> on 2011/05/13 23:57:37 UTC

PoweredBy reverted to revision 282 on Hadoop Wiki

Dear wiki user,

You have subscribed to a wiki page "Hadoop Wiki" for change notification.

The page PoweredBy has been reverted to revision 282 by DougCutting.
The comment on this change is: spam.
http://wiki.apache.org/hadoop/PoweredBy?action=diff&rev1=283&rev2=284

--------------------------------------------------

  
    * Our production cluster has been running since Oct 2008.
  
-  * [[http://dentaldentistsolutions.blogspot.com/2009/10/process-and-pictures-dental-implants.html|Dental Implants]]
+  * [[http://www.adyard.de|adyard]]
   * We use Flume, Hadoop and Pig for log storage and report generation as well as ad targeting.
    * We currently have 12 nodes running HDFS and Pig and plan to add more from time to time.
   * 50% of our recommender system is pure Pig because of its ease of use.
@@ -309, +309 @@

    * We have multiple grids divided up based upon purpose. They are composed of the following types of hardware:
     * 120 Nehalem-based nodes, with 2x4 cores, 24GB RAM, 8x1TB storage using ext4 in a JBOD configuration on CentOS 5.5.
     * 520 Westmere-based nodes, with 2x4 cores, 24GB RAM, 6x2TB storage using ext4 in a JBOD configuration on CentOS 5.5.
-   * We use modified versions of Apache's Hadoop and Pig distributions for discovering People You May Know and [[http://twitter.com/007simple|other]] [[http://itshumour.blogspot.com/2010/06/twenty-hilarious-funny-quotes.html|fun]] [[http://identi.ca/simple007|facts]].
+   * We use modified versions of Apache's Hadoop and Pig distributions for discovering People You May Know and [[http://www.linkedin.com/careerexplorer/dashboard|other]] [[http://inmaps.linkedinlabs.com/|fun]] [[http://www.linkedin.com/skills/|facts]].
  
   * [[http://www.lookery.com|Lookery]]
    * We use Hadoop to process clickstream and demographic data in order to create web analytic reports.
@@ -545, +545 @@

    * We also use Hadoop for filtering and indexing listing, processing log analysis, and for recommendation data.
  
  = W =
-  * [[http://wiki.citizen.apps.gov/pillbox/bin/view/Main/SussaneNg|Web Alliance]]
+  * [[http://www.web-alliance.fr|Web Alliance]]
   * We use Hadoop for our internal search engine optimization (SEO) tools. It allows us to store, index, and search data much faster.
    * We also use it for logs analysis and trends prediction.
   * [[http://www.worldlingo.com/|WorldLingo]]