Posted to common-commits@hadoop.apache.org by Apache Wiki <wi...@apache.org> on 2010/12/08 02:35:49 UTC
[Hadoop Wiki] Update of "PoweredBy" by jesily
Dear Wiki user,
You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.
The "PoweredBy" page has been changed by jesily.
http://wiki.apache.org/hadoop/PoweredBy?action=diff&rev1=238&rev2=239
--------------------------------------------------
- This page documents an alphabetical list of institutions that are using Hadoop for educational or production uses. Companies that offer services on or based around Hadoop are listed in [[Distributions and Commercial Support]] .
+ This page documents an alphabetical list of institutions that are using Hadoop for educational or production uses. Companies that offer services on or based around Hadoop are listed in [[Distributions and Commercial Support]] .
<<TableOfContents(3)>>
@@ -170, +170 @@
* We also use Hadoop to analyze similarities in users' behavior.
= G =
+ * [[http://www.gis.tw/en|GIS.FCU]]
+ * Feng Chia University
+ * 3 machine cluster
+ * storage for sensor data
* [[http://www.google.com|Google]]
* [[http://www.google.com/intl/en/press/pressrel/20071008_ibm_univ.html|University Initiative to Address Internet-Scale Computing Challenges]]
@@ -269, +273 @@
* HBase & Hadoop version 0.20
* [[http://www.linkedin.com|LinkedIn]]
- * We have multiple grids divided up based upon purpose. They are composed of the following types of hardware:
+ * We have multiple grids divided up based upon purpose. They are composed of the following types of hardware:
* 100 Nehalem-based nodes, with 2x4 cores, 24GB RAM, 8x1TB storage using ZFS in a JBOD configuration on Solaris.
* 120 Westmere-based nodes, with 2x4 cores, 24GB RAM, 6x2TB storage using ext4 in a JBOD configuration on CentOS 5.5.
* We use Hadoop and Pig for discovering People You May Know and other fun facts.