Posted to dev@kafka.apache.org by manjuapu <gi...@git.apache.org> on 2017/10/26 20:18:34 UTC

[GitHub] kafka-site pull request #104: Replace link

GitHub user manjuapu opened a pull request:

    https://github.com/apache/kafka-site/pull/104

    Replace link

    @guozhangwang Please review.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/manjuapu/kafka-site replace-link

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/kafka-site/pull/104.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #104
    
----
commit 15f317907127b35eaaeb4b58c3984d5345eaf1e5
Author: Manjula Kumar <ma...@confluent.io>
Date:   2017-10-18T15:52:31Z

    MINOR: Pinterest link correction
    
    Social proof logo column swap
    
    Add zalando blog link to streams page

commit 21802dd1ff0c0ee6a076c1f2f43706830b45239c
Author: Manjula Kumar <ma...@confluent.io>
Date:   2017-10-26T20:01:05Z

    Adding nav bar to streams sub page

commit c612e3be09c545c9a06137196fda815b074fd47b
Author: Manjula Kumar <ma...@confluent.io>
Date:   2017-10-26T20:08:02Z

    Resolved conflicts

----


---

[GitHub] kafka-site pull request #104: Replace link

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/kafka-site/pull/104


---

[GitHub] kafka-site pull request #104: Replace link

Posted by joel-hamill <gi...@git.apache.org>.
Github user joel-hamill commented on a diff in the pull request:

    https://github.com/apache/kafka-site/pull/104#discussion_r147256824
  
    --- Diff: 0110/streams/quickstart.html ---
    @@ -18,11 +18,21 @@
     
     <script id="content-template" type="text/x-handlebars-template">
       <h1>Run Streams Demo Application</h1>
    -
    -<p>
    +  <div class="sub-nav-sticky">
    +      <div class="sticky-top">
    +        <div style="height:35px">
    +          <a href="/{{version}}/documentation/streams/">Introduction</a>
    +          <a href="/{{version}}/documentation/streams/developer-guide">Developers Guide</a>
    --- End diff --
    
    typo `Developer Guide`
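    
    For reference, the corrected line would presumably reuse the label already present on the streams index page (a sketch only; the href stays the same):
    
        <!-- hypothetical corrected link: only the label text changes -->
        <a href="/{{version}}/documentation/streams/developer-guide">Developer Guide</a>
    
    The same label appears in the other streams sub-pages touched by this PR (core-concepts, index, developer-guide, tutorial).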


---

[GitHub] kafka-site issue #104: Replace link

Posted by guozhangwang <gi...@git.apache.org>.
Github user guozhangwang commented on the issue:

    https://github.com/apache/kafka-site/pull/104
  
    Merged to asf-site.


---

[GitHub] kafka-site pull request #104: Replace link

Posted by joel-hamill <gi...@git.apache.org>.
Github user joel-hamill commented on a diff in the pull request:

    https://github.com/apache/kafka-site/pull/104#discussion_r147256597
  
    --- Diff: 0110/streams/core-concepts.html ---
    @@ -19,7 +19,17 @@
     
     <script id="content-template" type="text/x-handlebars-template">
         <h1>Core Concepts</h1>
    -
    +    <div class="sub-nav-sticky">
    +      <div class="sticky-top">
    +        <div style="height:35px">
    +          <a href="/{{version}}/documentation/streams/">Introduction</a>
    +          <a href="/{{version}}/documentation/streams/developer-guide">Developers Guide</a>
    --- End diff --
    
    typo `Developer Guide`


---

[GitHub] kafka-site pull request #104: Replace link

Posted by joel-hamill <gi...@git.apache.org>.
Github user joel-hamill commented on a diff in the pull request:

    https://github.com/apache/kafka-site/pull/104#discussion_r147256782
  
    --- Diff: 0110/streams/index.html ---
    @@ -17,20 +17,20 @@
     </script>
     <script id="streams-template" type="text/x-handlebars-template">
       <h1>Kafka Streams API</h1>
    -       <div class="sub-nav-sticky">
    +     <div class="sub-nav-sticky">
           <div class="sticky-top">
             <div style="height:35px">
    -          <a  class="active-menu-item" href="#">Introduction</a>
    -          <a href="/{{version}}/documentation/streams/developer-guide">Developer Guide</a>
    +          <a  class="active-menu-item" href="/{{version}}/documentation/streams/">Introduction</a>
    +          <a href="/{{version}}/documentation/streams/developer-guide">Developers Guide</a>
    --- End diff --
    
    typo `Developer Guide`


---

[GitHub] kafka-site pull request #104: Replace link

Posted by joel-hamill <gi...@git.apache.org>.
Github user joel-hamill commented on a diff in the pull request:

    https://github.com/apache/kafka-site/pull/104#discussion_r147256754
  
    --- Diff: 0110/streams/developer-guide.html ---
    @@ -19,7 +19,17 @@
     
     <script id="content-template" type="text/x-handlebars-template">
         <h1>Developer Guide for Kafka Streams API</h1>
    -    
    +    <div class="sub-nav-sticky">
    +      <div class="sticky-top">
    +        <div style="height:35px">
    +          <a href="/{{version}}/documentation/streams/">Introduction</a>
    +          <a class="active-menu-item" href="/{{version}}/documentation/streams/developer-guide">Developers Guide</a>
    --- End diff --
    
    typo `Developer Guide`


---

[GitHub] kafka-site pull request #104: Replace link

Posted by joel-hamill <gi...@git.apache.org>.
Github user joel-hamill commented on a diff in the pull request:

    https://github.com/apache/kafka-site/pull/104#discussion_r147256868
  
    --- Diff: 0110/streams/tutorial.html ---
    @@ -18,7 +18,17 @@
     
     <script id="content-template" type="text/x-handlebars-template">
         <h1>Tutorial: Write a Streams Application</h1>
    -
    +    <div class="sub-nav-sticky">
    +      <div class="sticky-top">
    +        <div style="height:35px">
    +          <a href="/{{version}}/documentation/streams/">Introduction</a>
    +          <a href="/{{version}}/documentation/streams/developer-guide">Developers Guide</a>
    --- End diff --
    
    typo `Developer Guide`


---

[GitHub] kafka-site pull request #104: Replace link

Posted by guozhangwang <gi...@git.apache.org>.
Github user guozhangwang commented on a diff in the pull request:

    https://github.com/apache/kafka-site/pull/104#discussion_r147261460
  
    --- Diff: powered-by.html ---
    @@ -2,453 +2,452 @@
     <script>
         // powered by items
         var poweredByItems = [
    -        {
    -            "link":  "https://www.nytimes.com",
    -            "logo": "NYT.jpg",
    -            "logoBgColor": "#FFFFFF",
    -            "description": "<a href='https://www.confluent.io/blog/publishing-apache-kafka-new-york-times/'>The New York Times uses Apache Kafka </a>and the Kafka Streams API to store and distribute, in real-time, published content to the various applications and systems that make it available to the readers."
    -        }, {
    -            "link":  "http://pinterest.com",
    -            "logo": "pinterest.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is used with <a href='https://engineering.pinterest.com/blog/introducing-pinterest-secor' target='_blank'>Secor</a> as part of their <a href='https://www.pinterest.com/' target='_blank'>log collection pipeline</a>."
    -        }, {
    -            "link":  "http://www.zalando.com",
    -            "logo": "zalando.jpg",
    -            "logoBgColor": "#ffffff",
    -            "description": "As the leading online fashion retailer in Europe, Zalando uses Kafka as an ESB (Enterprise Service Bus), which helps us in transitioning from a monolithic to a micro services architecture. Using Kafka for processing <a href 'https://kafka-summit.org/sessions/using-kstreams-ktables-calculate-real-time-domain-rankings/' target=blank'> event streams</a> enables our technical team to do near-real time business intelligence."
    -        }, {
    -            "link":  "http://linkedin.com",
    -            "logo": "linkedin.jpg",
    -            "logoBgColor": "#007bb6",
    -            "description": "Apache Kafka is used at LinkedIn for activity stream data and operational metrics. This powers various products like LinkedIn Newsfeed, LinkedIn Today in addition to our offline analytics systems like Hadoop."
    -        }, {
    -            "link":  "http://addthis.com/",
    -            "logo": "addthis.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Apache Kafka is used at AddThis to collect events generated by our data network and broker that data to our analytics clusters and real-time web analytics platform."
    -        }, {
    -            "link":  "http://www.airbnb.com/",
    -            "logo": "airbnb.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Used in our event pipeline, exception tracking & more to come."
    -        }, {
    -            "link":  "http://www.ancestry.com/",
    -            "logo": "ancestry.svg",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is used as the <a href='http://blogs.ancestry.com/techroots/on-track-to-data-driven' target='_blank'event log processing pipeline </a>for delivering better personalized product and service to our customers."
    -        }, {
    -            "link":  "https://boundary.com/",
    -            "logo": "boundary.gif",
    -            "logoBgColor": "#ffffff",
    -            "description": "Apache Kafka aggregates high-flow message streams into a unified distributed pubsub service, brokering the data for other internal systems as part of Boundary's real-time network analytics infrastructure."
    -        },  {
    -            "link":  "http://www.cerner.com/",
    -            "logo": "cerner.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is used with HBase and Storm as described <a href='http://blog.cloudera.com/blog/2014/11/how-cerner-uses-cdh-with-apache-kafka/' target='_blank'here.</a>"
    -        }, {
    -            "link":  "https://www.coursera.org/",
    -            "logo": "coursera.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "At Coursera, Kafka powers education at scale, serving as the data pipeline for realtime learning analytics/dashboards."
    -        }, {
    -            "link":  "https://www.cloudflare.com/",
    -            "logo": "cloudfare.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "CloudFlare uses Kafka for our log processing and analytics pipeline, collecting hundreds of billions of events/day data from a thousands of servers."
    -        }, {
    -            "link":  "http://www.cloudphysics.com/",
    -            "logo": "cloudphysics.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is powering our high-flow event pipeline that aggregates over 1.2 billion metric series from 1000+ data centers for near-to-real time data center operational analytics and modeling"
    -        }, {
    -            "link":  "http://datasift.com/",
    -            "logo": "datasift.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Apache Kafka is used at DataSift as a collector of monitoring events and to track user's consumption of data streams in real time. <a href='http://highscalability.com/blog/2011/11/29/datasift-architecture-realtime-datamining-at-120000-tweets-p.html' target='_blank'>DataSift architecture</a>"
    -        }, {
    -            "link":  "http://datadog.com/",
    -            "logo": "datadog.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka brokers data to most systems in our metrics and events ingestion pipeline. Different modules contribute and consume data from it, for streaming CEP (homegrown), persistence (at different &quot;atemperatures&quot;a in Redis, ElasticSearch, Cassandra, S3), or batch analysis (Hadoop)."
    -        }, {
    -            "link":  "https://www.box.com/",
    -            "logo": "box.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "At Box, Kafka is used for the production analytics pipeline & real time monitoring infrastructure. We are planning to use Kafka for some of the new products & features"
    -        }, {
    -            "link":  "http://www.cisco.com/",
    -            "logo": "cisco.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Cisco is using Kafka as part of their OpenSOC (Security Operations Center). More details <a href='http://opensoc.github.io/' target='_blank'here.</a>"
    -        }, {
    -            "link":  "http://www.cityzendata.com/",
    -            "logo": "cityzen.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Cityzen Data uses Kafka as well, we provide a platform for collecting, storing and analyzing machine data."
    -        }, {
    -            "link":  "http://www.criteo.com/",
    -            "logo": "criteo.jpeg",
    -            "logoBgColor": "#ffffff",
    -            "description": "Criteo uses Kafka as well, we provide a platform for collecting, storing and analyzing machine data."
    -        }, {
    -            "link":  "https://www.etsy.com/",
    -            "logo": "etsy.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "See <a href='http://siliconangle.com/blog/2015/08/11/etsy-going-all-in-with-kafka-as-dataflow-pipeline-hpbigdata15/' target='_blank'>this article</a>."
    -        }, {
    -            "link":  "http://www.exponential.com/",
    -            "logo": "exponential.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Exponential is using Kafka in production to power the events ingestion pipeline for real time analytics and log feed consumption."
    -        }, {
    -            "link":  "https://www.exoscale.ch/",
    -            "logo": "exoscale.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Exoscale uses Kafka in production."
    -        }, {
    -            "link":  "http://www.liveperson.com/",
    -            "logo": "liveperson.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Using Kafka as the main data bus for all real time events."
    -        }, {
    -            "link":  "http://www.outbrain.com/",
    -            "logo": "outbrain.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka in production for real time log collection and processing, and for cross-DC cache propagation."
    -        }, {
    -            "link":  "http://www.retentionscience.com/",
    -            "logo": "retentionscience.jpg",
    -            "logoBgColor": "#ffffff",
    -            "description": "Click stream ingestion and processing."
    -        }, {
    -            "link":  "http://www.strava.com/",
    -            "logo": "strava.jpg",
    -            "logoBgColor": "#ffffff",
    -            "description": "Powers our analytics pipeline, activity feeds denorm and several other production services."
    -        }, {
    -            "link":  "http://www.swiftkey.net/",
    -            "logo": "swiftkey.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Apache Kafka for analytics event processing."
    -        }, {
    -            "link":  "https://eng.uber.com/",
    -            "logo": "uber.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Apache Kafka is a core part of Uber’s overall infrastructure stack and powers various online & near realtime use-cases."
    -        }, {
    -            "link":  "http://emergingthreats.net/",
    -            "logo": "emergingthreats.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Emerging threats uses Kafka in our event pipeline to process billions of malware events for search indices, alerting systems, etc."
    -        }, {
    -            "link":  "http://foursquare.com/",
    -            "logo": "foursquare.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka powers online to online messaging, and online to offline messaging at Foursquare. We integrate with monitoring, production systems, and our offline infrastructure, including hadoop."
    -        }, {
    -            "link":  "http://www.goldmansachs.com/",
    -            "logo": "goldmansachs.jpg",
    -            "logoBgColor": "#64a8f1",
    -            "description": "<a href='http://www.goldmansachs.com/' target='_blank'>www.goldmansachs.com</a>"
    -        }, {
    -            "link":  "http://www.mate1.com/about",
    -            "logo": "mate1.png",
    -            "logoBgColor": "#000000",
    -            "description": "Apache kafka is used at Mate1 as our main event bus that powers our news and activity feeds, automated review systems, and will soon power real time notifications and log distribution."
    -        }, {
    -            "link":  "http://mozilla.org/",
    -            "logo": "mozilla.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka will soon be replacing part of our current production system to collect performance and usage data from the end-users browser for projects like Telemetry, Test Pilot, etc. Downstream consumers usually persist to either HDFS or HBase."
    -        },  {
    -            "link":  "http://netflix.com",
    -            "logo": "netflix.png",
    -            "logoBgColor": "#FFFFFF",
    -            "description": "Real-time monitoring and event-processing <a href='http://techblog.netflix.com/2016/04/kafka-inside-keystone-pipeline.html' target='_blank'>pipeline</a>."
    -        },  {
    -            "link":  "http://www.oracle.com/",
    -            "logo": "oracle.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Oracle provides native connectivity to Kafka from its Enterprise Service Bus product called OSB (Oracle Service Bus) which allows developers to leverage OSB built-in mediation capabilities to implement staged data pipelines."
    -        },  {
    -            "link":  "http://square.com",
    -            "logo": "square.png",
    -            "logoBgColor": "#FFFFFF",
    -            "description": "We use Kafka as a bus to move all systems events through our various datacenters. This includes metrics, logs, custom events etc. On the consumer side, we output into Splunk, Graphite, Esper-like real-time alerting."
    -        }, {
    -            "link":  "http://spotify.com",
    -            "logo": "spotify.png",
    -            "logoBgColor": "#1ed760",
    -            "description": "Kafka is used at Spotify as part of their log <a href='http://www.meetup.com/stockholm-hug/events/121628932' target='_blank'>delivery system</a>."
    -        }, {
    -            "link":  "http://www.stumbleupon.com/",
    -            "logo": "stumbleupon.png",
    -            "logoBgColor": "#eb4924",
    -            "description": "Data collection platform for analytics."
    -        }, {
    -            "link":  "http://www.tagged.com/",
    -            "logo": "tagged.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Apache Kafka drives our new pub sub system which delivers real-time events for users in our latest game - Deckadence. It will soon be used in a host of new use cases including group chat and back end stats and log collection."
    -        }, {
    -            "link":  "https://www.tumblr.com/",
    -            "logo": "tumblr.png",
    -            "logoBgColor": "#5eba8c",
    -            "description": "See <a href='http://highscalability.com/blog/2012/2/13/tumblr-architecture-15-billion-page-views-a-month-and-harder.html' target='_blank'>this</a>."
    -        }, {
    -            "link":  "http://twitter.com",
    -            "logo": "twitter.jpg",
    -            "logoBgColor": "#28a9e2",
    -            "description": "As part of their Storm stream processing infrastructure, e.g. <a href='http://engineering.twitter.com/2013/01/improving-twitter-search-with-real-time.html' target='_blank'>this</a> and <a href='https://blog.twitter.com/2015/handling-five-billion-sessions-a-day-in-real-time' target='_blank'>this</a>."
    -        }, {
    -            "link":  "http://www.paypal.com/",
    -            "logo": "paypal.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "See <a href='https://github.com/paypal/couchbasekafka' target='_blank'>this</a>."
    -        }, {
    -            "link":  "http://www.shopify.com/",
    -            "logo": "shopify.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Access logs, A/B testing events, domain events (&quot;a checkout happened&quot;, etc.), metrics, delivery to HDFS, and customer reporting. We are now focusing on consumers: analytics, support tools, and fraud analysis."
    -        },  {
    -            "link":  "http://www.oracle.com/technetwork/middleware/goldengate/overview/index.html",
    -            "logo": "oraclegoldengate.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "GoldenGate offers a comprehensive solution that streams transactional data from various sources into various big data targets including Kafka in real-time, enabling organizations to build fault -tolerant, highly reliable, and extensible analytical applications."
    -        },  {
    -            "link":  "http://www.socialtwist.com/",
    -            "logo": "socialtwist.jpg",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka internally as part of our reliable email queueing system."
    -        },  {
    -            "link":  "http://www.spongecell.com/",
    -            "logo": "spongecell.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka to run our entire analytics and monitoring pipeline driving both real-time and ETL applications for our customers."
    -        },  {
    -            "link":  "https://www.simple.com/",
    -            "logo": "simple.gif",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka at Simple for log aggregation and to power our analytics infrastructure."
    -        },  {
    -            "link":  "http://www.urbanairship.com/",
    -            "logo": "urbanairship.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "At Urban Airship we use Kafka to buffer incoming data points from mobile devices for processing by our analytics infrastructure."
    -        },  {
    -            "link":  "http://wooga.com/",
    -            "logo": "wooga.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka to aggregate and process tracking data from all our facebook games (which are hosted at various providers) in a central location."
    -        },  {
    -            "link":  "http://metamarkets.com/",
    -            "logo": "metamarkets.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka to ingest real-time event data, stream it to Storm and Hadoop, and then serve it from our Druid cluster to feed our interactive analytics dashboards. We've also built  connectors for directly ingesting events from Kafka into Druid."
    -        },  {
    -            "link":  "http://gnip.com/",
    -            "logo": "gnip.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is used in their twitter ingestion and processing pipeline."
    -        },  {
    -            "link":  "http://www.flyhajj.com/",
    -            "logo": "flyhajj.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka to collect all metrics and events generated by the users of the website."
    -        }, {
    -            "link":  "http://loggly.com/",
    -            "logo": "loggly.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Loggly is the world's most popular cloud-based log management. Our cloud-based log management service helps DevOps and technical teams make sense of the the massive quantity of logs. Kafka is used as part of our <a href='http://www.loggly.com/behind-the-screens' target='_blank'log collection and processing infrastructure.</a>"
    -        },  {
    -            "link":  "http://www.richrelevance.com/",
    -            "logo": "richrelevance.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Real-time tracking event pipeline."
    -        },  {
    -            "link":  "http://www.uswitch.com/",
    -            "logo": "uswitch.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "See <a href='http://oobaloo.co.uk/kafka-for-uswitchs-event-pipeline' target='_blank'>this blog</a>."
    -        }, {
    -            "link":  "http://www.infochimps.com/",
    -            "logo": "infochimps.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is part of the <a href='http://blog.infochimps.com/2012/10/30/next-gen-real-time-streaming-storm-kafka-integration' target='_blank'>InfoChimps real-time data platform</a>."
    -        }, {
    -            "link":  "http://www.ooyala.com/",
    -            "logo": "ooyala.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is used as the primary high speed message queue to power Storm and our real-time analytics/event ingestion pipelines."
    -        }, {
    -            "link":  "http://sematext.com/",
    -            "logo": "sematext.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "In <a href='http://sematext.com/spm' target='_blank'SPM</a> (performance monitoring + alerting), Kafka is used for metrics collection and feeds SPM's in-memory data aggregation (OLAP cube creation) as well as our CEP/Alerts servers (see also: <a href='http://blog.sematext.com/2013/10/16/announcement-spm-performance-monitoring-for-kafka/' target=_blank'>SPM for Kafka performance monitoring</a>). In <a href='http://sematext.com/search-analytics' target='_blank'>SA (search analytics)</a> Kafka is used in search and click stream collection before being aggregated and persisted. In <a href='http://sematext.com/logsene' target='_blank'Logsene (log analytics)</a> Kafka is used to pass logs and other events from front-end receivers to the persistent backend."
    -        }, {
    -            "link":  "http://quixey.com/",
    -            "logo": "quixey.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "At Quixey, The Search Engine for Apps, Kafka is an integral part of our eventing, logging and messaging infrastructure."
    -        }, {
    -            "link":  "http://www.linksmart.com/",
    -            "logo": "linksmart.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is used at LinkSmart as an event stream feeding Hadoop and custom real time systems."
    -        }, {
    -            "link":  "http://www.lucidworks.com/products/lucidworks-big-data",
    -            "logo": "lucidworks.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka for syncing LucidWorks Search (Solr) with incoming data from Hadoop and also for sending LucidWorks Search logs back to Hadoop for analysis."
    -        }, {
    -            "link":  "http://graylog2.org/",
    -            "logo": "graylog2.jpg",
    -            "logoBgColor": "#ffffff",
    -            "description": "Graylog2 is a free and open source log management and data analysis system. It's using Kafka as default transport for Graylog2 Radio. The use case is described <a href='http://support.torch.sh/help/kb/graylog2-server/using-graylog2-radio-v020x' target='_blank'here</a>."
    -        }, {
    -            "link":  "http://www.visualrevenue.com/",
    -            "logo": "visualrevenue.jpg",
    -            "logoBgColor": "#1c1a88",
    -            "description": "We use Kafka as a distributed queue in front of our web traffic stream processing infrastructure (Storm)."
    -        }, {
    -            "link":  "http://www.visualdna.com/",
    -            "logo": "visualdna.jpg",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka 1. as an infrastructure that helps us bring continuously the tracking events from various datacenters into our central hadoop cluster for offline processing, 2. as a propagation path for data integration, 3. as a real-time platform for future inference and recommendation engines"
    -        }, {
    -            "link":  "http://www.wizecommerce.com/",
    -            "logo": "wizecommerce.gif",
    -            "logoBgColor": "#ffffff",
    -            "description": "At Wize Commerce (previously, NexTag), Kafka is used as a distributed queue in front of Storm based processing for search index generation. We plan to also use it for collecting user generated data on our web tier, landing the data into various data sinks like Hadoop, HBase, etc."
    -        }, {
    -            "link":  "http://www.yieldbot.com/",
    -            "logo": "yieldbot.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Yieldbot uses kafka for real-time events, camus for batch loading, and mirrormakers for x-region replication."
    -        }, {
    -            "link":  "http://yellerapp.com/",
    -            "logo": "yeller.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Yeller uses Kafka to process large streams of incoming exception data for it's customers. Rate limiting, throttling and batching are all built on top of Kafka."
    -        }, {
    -            "link":  "http://www.hotels.com/",
    -            "logo": "hotels.jpg",
    -            "logoBgColor": "#ffffff",
    -            "description": "Hotels.com uses Kafka as pipeline to collect real time events from multiple sources and for sending data to HDFS."
    -        }, {
    -            "link":  "http://helprace.com/help-desk",
    -            "logo": "helprace.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is used as a distributed high speed message queue in our help desk software as well as our real-time event data aggregation and analytics."
    -        }, {
    -            "link":  "http://web.livefyre.com/",
    -            "logo": "livefyre.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Livefyre uses Kafka for the real time notifications, analytics pipeline and as the primary mechanism for general pub/sub."
    -        }, {
    -            "link":  "http://wikimediafoundation.org/wiki/Our_projects",
    -            "logo": "wikimedia.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Wikimedia Foundation uses Kafka as a log transport for analytics data from production webservers and applications.  This data is consumed into Hadoop using Camus and to other processors of analytics data."
    -        }, {
    -            "link":  "http://www.ovh.com/us/index.xml",
    -            "logo": "ovh.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "OVH uses Kafka in production for over a year now using it for event bus, data pipeline for antiddos and more to come."
    -        }, {
    -            "link":  "http://helpshift.com/",
    -            "logo": "helpshift.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Produces billions of events with Kafka through an erlang based producer ekaf that supports 8.0, and consumes topics primarily with storm and clojure."
    -        }, {
    -            "link":  "http://www.parsely.com/",
    -            "logo": "parsely.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is used for all <a href='http://www.parsely.com/misc/slides/logs/#1' target=_blank'>data integration </a> of analytics event data."
    -        }, {
    -            "link":  "https://www.vividcortex.com/",
    -            "logo": "vividcortex.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "VividCortex uses Kafka in our SaaS MySQL performance management platform to reliably ingest high-volume 1-second timeseries data."
    -        }, {
    -            "link":  "http://www.trivago.com/",
    -            "logo": "trivago.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Trivago uses Kafka for stream processing in Storm as well as processing of application logs."
    -        }, {
    -            "link":  "http://www.ants.vn/",
    -            "logo": "ants.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Ants.vn use Kafka in production for stream processing and log transfer (over 5B messages/month and growing)"
    -        }, {
    -            "link":  "http://www.ifttt.com/",
    -            "logo": "ifttt.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka to ingest real-time log and tracking data for analytics, dashboards, and machine learning."
    -        }, {
    -            "link":  "http://homeadvisor.com/",
    -            "logo": "homeadvisor.jpg",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka for logging and async event processing, among other uses."
    -        }, {
    -            "link":  "http://www.skyscanner.net/",
    -            "logo": "skyscanner.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "The world's travel search engine, uses Kafka for real-time log and event ingestion. It is the integration point for of all stream-processing and data transportation services."
    -        }, {
    -            "link":  "https://developer.ibm.com/messaging/message-hub/",
    -            "logo": "ibmmessagehub.png",
    -            "logoBgColor": "#1e3648",
    -            "description": "The Message Hub service in our Bluemix PaaS offers Kafka-based messaging in a multi-tenant, pay-as-you-go public cloud. It's intended to provide messaging services for microservices, event-driven processing and streaming data in to analytics systems."
    -        }, {
    -            "link":  "http://www.ipinyou.com.cn/?defaultLocale=en",
    -            "logo": "ipinyou.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "The largest DSP in China which has its HQ in Beijing and offices in Shanghai, Guangzhou, Silicon Valley and Seattle. Kafka clusters are the central data hub in iPinYou. All kinds of Internet display advertising data, such as bid/no-bid, impression, click, advertiser, conversion and etc., are collected as primary data streams into Kafka brokers in real time, by LogAggregator (a substitute for Apache Flume, which is implemented in C/C++ by iPinYou, has customized functionality, better performance, lower resource-consuming)."
    -        }, {
    -            "link":  "https://mailchimp.com/",
    -            "logo": "mailchimp.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka powers MailChimp’s data pipeline that in turn powers <a href='https://mailchimp.com/pro/' target=blank'>MailChimp Pro</a>, as well as an increasing number of other product features. You can read some of the details <a href='https://devs.mailchimp.com/blog/powering-mailchimp-pro-reporting/' target=blank'>here</a>."
    -        }, {
    -            "link":  "https://www.rabobank.com",
    -            "logo": "rabobank.jpg",
    -            "logoBgColor": "#ffffff",
    -            "description": "Rabobank is one of the 3 largest banks in the Netherlands. Its digital nervous system, the Business Event Bus, is powered by Apache Kafka. It is used by an increasing amount of financial processes and services, one which is Rabo Alerts. This service alerts customers in real-time upon financial events and is built using Kafka Streams."
    -        },{
    -            "link":  "http://www.portoseguro.com.br/",
    -            "logo": "porto-seguro.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka in production for online and near real-time solutions. Kafka is a core part for many products, such as our Credit Card System."
    -        },{
    -            "link":  "https://empathy.micronauticsresearch.com/",
    -            "logo": "robotCircle.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "<a href 'https://empathy.micronauticsresearch.com/' target=blank'> EmpathyWorks</a> is a framework for simulating and analyzing networks of artificial personalities."
    -        },{
    -            "link":  "https://www.cj.com/",
    -            "logo": "CJ_Affiliate.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Apache Kafka is used at CJ Affiliate to process many of the key events driving our core product. Nearly every aspect of CJ's products and services currently benefit from the speed and stability this provides; additionally, Apache Kafka is one of the key technologies enabling CJ's upcoming real-time Insights & Analytics platform."
    -        }, {
    -            "link":  "http://xitenetworks.com/",
    -            "logo": "xite.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is at the heart of our Data Infrastructure - Business Intelligence, Recommender Systems and Machine Learning solutions are build as reactive and streaming architecture. Also we use Kafka as a great alternative to REST APIs for micro-services integration. This allows us to scale and reliably upgrade micro-services without integration and consistency issues."
    -        },{
    -            "link":  "http://yahoo.com",
    -            "logo": "yahoo.png",
    -            "logoBgColor": "#3d018b",
    -            "description": "See <a href='http://yahooeng.tumblr.com/post/109994930921/kafka-yahoo' target='_blank'>this</a>."
    -        }, {
    -            "link":  "https://linecorp.com/",
    -            "logo": "line.png",
    -            "logoBgColor": "#00b900",
    -            "description": "LINE uses Apache Kafka as a central datahub for our services to communicate to one another. Hundreds of billions of messages are produced daily and are used to execute various business logic, threat detection, search indexing and data analysis. LINE leverages Kafka Streams to reliably transform and filter topics enabling sub topics consumers can efficiently consume, meanwhile retaining easy maintainability thanks to its sophisticated yet minimal code base."
    -        }
    -    ];
    +    {
    --- End diff --
    
    Just a meta comment: re-arranging items and re-formatting at the same time would make the diff file very hard to review.
    
    I'd assume there are no content changes in this file; otherwise please let me know, @manjuapu.
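    
    One way to sanity-check that the reshuffle really is content-neutral (a hypothetical browser-console snippet, not part of this PR; it assumes the page still exposes the poweredByItems array shown above):
    
        // Run this on both the old and the new powered-by page and compare the output.
        // Matching counts and link lists mean the re-ordering/re-formatting changed no content.
        console.log(poweredByItems.length);
        console.log(poweredByItems.map(function (item) { return item.link; }).sort().join("\n"));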


---

[GitHub] kafka-site pull request #104: Replace link

Posted by manjuapu <gi...@git.apache.org>.
Github user manjuapu commented on a diff in the pull request:

    https://github.com/apache/kafka-site/pull/104#discussion_r147262007
  
    --- Diff: powered-by.html ---
    @@ -2,453 +2,452 @@
     <script>
         // powered by items
         var poweredByItems = [
    -        {
    -            "link":  "https://www.nytimes.com",
    -            "logo": "NYT.jpg",
    -            "logoBgColor": "#FFFFFF",
    -            "description": "<a href='https://www.confluent.io/blog/publishing-apache-kafka-new-york-times/'>The New York Times uses Apache Kafka </a>and the Kafka Streams API to store and distribute, in real-time, published content to the various applications and systems that make it available to the readers."
    -        }, {
    -            "link":  "http://pinterest.com",
    -            "logo": "pinterest.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is used with <a href='https://engineering.pinterest.com/blog/introducing-pinterest-secor' target='_blank'>Secor</a> as part of their <a href='https://www.pinterest.com/' target='_blank'>log collection pipeline</a>."
    -        }, {
    -            "link":  "http://www.zalando.com",
    -            "logo": "zalando.jpg",
    -            "logoBgColor": "#ffffff",
    -            "description": "As the leading online fashion retailer in Europe, Zalando uses Kafka as an ESB (Enterprise Service Bus), which helps us in transitioning from a monolithic to a micro services architecture. Using Kafka for processing <a href 'https://kafka-summit.org/sessions/using-kstreams-ktables-calculate-real-time-domain-rankings/' target=blank'> event streams</a> enables our technical team to do near-real time business intelligence."
    -        }, {
    -            "link":  "http://linkedin.com",
    -            "logo": "linkedin.jpg",
    -            "logoBgColor": "#007bb6",
    -            "description": "Apache Kafka is used at LinkedIn for activity stream data and operational metrics. This powers various products like LinkedIn Newsfeed, LinkedIn Today in addition to our offline analytics systems like Hadoop."
    -        }, {
    -            "link":  "http://addthis.com/",
    -            "logo": "addthis.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Apache Kafka is used at AddThis to collect events generated by our data network and broker that data to our analytics clusters and real-time web analytics platform."
    -        }, {
    -            "link":  "http://www.airbnb.com/",
    -            "logo": "airbnb.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Used in our event pipeline, exception tracking & more to come."
    -        }, {
    -            "link":  "http://www.ancestry.com/",
    -            "logo": "ancestry.svg",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is used as the <a href='http://blogs.ancestry.com/techroots/on-track-to-data-driven' target='_blank'event log processing pipeline </a>for delivering better personalized product and service to our customers."
    -        }, {
    -            "link":  "https://boundary.com/",
    -            "logo": "boundary.gif",
    -            "logoBgColor": "#ffffff",
    -            "description": "Apache Kafka aggregates high-flow message streams into a unified distributed pubsub service, brokering the data for other internal systems as part of Boundary's real-time network analytics infrastructure."
    -        },  {
    -            "link":  "http://www.cerner.com/",
    -            "logo": "cerner.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is used with HBase and Storm as described <a href='http://blog.cloudera.com/blog/2014/11/how-cerner-uses-cdh-with-apache-kafka/' target='_blank'here.</a>"
    -        }, {
    -            "link":  "https://www.coursera.org/",
    -            "logo": "coursera.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "At Coursera, Kafka powers education at scale, serving as the data pipeline for realtime learning analytics/dashboards."
    -        }, {
    -            "link":  "https://www.cloudflare.com/",
    -            "logo": "cloudfare.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "CloudFlare uses Kafka for our log processing and analytics pipeline, collecting hundreds of billions of events/day data from a thousands of servers."
    -        }, {
    -            "link":  "http://www.cloudphysics.com/",
    -            "logo": "cloudphysics.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is powering our high-flow event pipeline that aggregates over 1.2 billion metric series from 1000+ data centers for near-to-real time data center operational analytics and modeling"
    -        }, {
    -            "link":  "http://datasift.com/",
    -            "logo": "datasift.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Apache Kafka is used at DataSift as a collector of monitoring events and to track user's consumption of data streams in real time. <a href='http://highscalability.com/blog/2011/11/29/datasift-architecture-realtime-datamining-at-120000-tweets-p.html' target='_blank'>DataSift architecture</a>"
    -        }, {
    -            "link":  "http://datadog.com/",
    -            "logo": "datadog.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka brokers data to most systems in our metrics and events ingestion pipeline. Different modules contribute and consume data from it, for streaming CEP (homegrown), persistence (at different &quot;atemperatures&quot;a in Redis, ElasticSearch, Cassandra, S3), or batch analysis (Hadoop)."
    -        }, {
    -            "link":  "https://www.box.com/",
    -            "logo": "box.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "At Box, Kafka is used for the production analytics pipeline & real time monitoring infrastructure. We are planning to use Kafka for some of the new products & features"
    -        }, {
    -            "link":  "http://www.cisco.com/",
    -            "logo": "cisco.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Cisco is using Kafka as part of their OpenSOC (Security Operations Center). More details <a href='http://opensoc.github.io/' target='_blank'here.</a>"
    -        }, {
    -            "link":  "http://www.cityzendata.com/",
    -            "logo": "cityzen.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Cityzen Data uses Kafka as well, we provide a platform for collecting, storing and analyzing machine data."
    -        }, {
    -            "link":  "http://www.criteo.com/",
    -            "logo": "criteo.jpeg",
    -            "logoBgColor": "#ffffff",
    -            "description": "Criteo uses Kafka as well, we provide a platform for collecting, storing and analyzing machine data."
    -        }, {
    -            "link":  "https://www.etsy.com/",
    -            "logo": "etsy.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "See <a href='http://siliconangle.com/blog/2015/08/11/etsy-going-all-in-with-kafka-as-dataflow-pipeline-hpbigdata15/' target='_blank'>this article</a>."
    -        }, {
    -            "link":  "http://www.exponential.com/",
    -            "logo": "exponential.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Exponential is using Kafka in production to power the events ingestion pipeline for real time analytics and log feed consumption."
    -        }, {
    -            "link":  "https://www.exoscale.ch/",
    -            "logo": "exoscale.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Exoscale uses Kafka in production."
    -        }, {
    -            "link":  "http://www.liveperson.com/",
    -            "logo": "liveperson.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Using Kafka as the main data bus for all real time events."
    -        }, {
    -            "link":  "http://www.outbrain.com/",
    -            "logo": "outbrain.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka in production for real time log collection and processing, and for cross-DC cache propagation."
    -        }, {
    -            "link":  "http://www.retentionscience.com/",
    -            "logo": "retentionscience.jpg",
    -            "logoBgColor": "#ffffff",
    -            "description": "Click stream ingestion and processing."
    -        }, {
    -            "link":  "http://www.strava.com/",
    -            "logo": "strava.jpg",
    -            "logoBgColor": "#ffffff",
    -            "description": "Powers our analytics pipeline, activity feeds denorm and several other production services."
    -        }, {
    -            "link":  "http://www.swiftkey.net/",
    -            "logo": "swiftkey.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Apache Kafka for analytics event processing."
    -        }, {
    -            "link":  "https://eng.uber.com/",
    -            "logo": "uber.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Apache Kafka is a core part of Uber’s overall infrastructure stack and powers various online & near realtime use-cases."
    -        }, {
    -            "link":  "http://emergingthreats.net/",
    -            "logo": "emergingthreats.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Emerging threats uses Kafka in our event pipeline to process billions of malware events for search indices, alerting systems, etc."
    -        }, {
    -            "link":  "http://foursquare.com/",
    -            "logo": "foursquare.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka powers online to online messaging, and online to offline messaging at Foursquare. We integrate with monitoring, production systems, and our offline infrastructure, including hadoop."
    -        }, {
    -            "link":  "http://www.goldmansachs.com/",
    -            "logo": "goldmansachs.jpg",
    -            "logoBgColor": "#64a8f1",
    -            "description": "<a href='http://www.goldmansachs.com/' target='_blank'>www.goldmansachs.com</a>"
    -        }, {
    -            "link":  "http://www.mate1.com/about",
    -            "logo": "mate1.png",
    -            "logoBgColor": "#000000",
    -            "description": "Apache kafka is used at Mate1 as our main event bus that powers our news and activity feeds, automated review systems, and will soon power real time notifications and log distribution."
    -        }, {
    -            "link":  "http://mozilla.org/",
    -            "logo": "mozilla.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka will soon be replacing part of our current production system to collect performance and usage data from the end-users browser for projects like Telemetry, Test Pilot, etc. Downstream consumers usually persist to either HDFS or HBase."
    -        },  {
    -            "link":  "http://netflix.com",
    -            "logo": "netflix.png",
    -            "logoBgColor": "#FFFFFF",
    -            "description": "Real-time monitoring and event-processing <a href='http://techblog.netflix.com/2016/04/kafka-inside-keystone-pipeline.html' target='_blank'>pipeline</a>."
    -        },  {
    -            "link":  "http://www.oracle.com/",
    -            "logo": "oracle.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Oracle provides native connectivity to Kafka from its Enterprise Service Bus product called OSB (Oracle Service Bus) which allows developers to leverage OSB built-in mediation capabilities to implement staged data pipelines."
    -        },  {
    -            "link":  "http://square.com",
    -            "logo": "square.png",
    -            "logoBgColor": "#FFFFFF",
    -            "description": "We use Kafka as a bus to move all systems events through our various datacenters. This includes metrics, logs, custom events etc. On the consumer side, we output into Splunk, Graphite, Esper-like real-time alerting."
    -        }, {
    -            "link":  "http://spotify.com",
    -            "logo": "spotify.png",
    -            "logoBgColor": "#1ed760",
    -            "description": "Kafka is used at Spotify as part of their log <a href='http://www.meetup.com/stockholm-hug/events/121628932' target='_blank'>delivery system</a>."
    -        }, {
    -            "link":  "http://www.stumbleupon.com/",
    -            "logo": "stumbleupon.png",
    -            "logoBgColor": "#eb4924",
    -            "description": "Data collection platform for analytics."
    -        }, {
    -            "link":  "http://www.tagged.com/",
    -            "logo": "tagged.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Apache Kafka drives our new pub sub system which delivers real-time events for users in our latest game - Deckadence. It will soon be used in a host of new use cases including group chat and back end stats and log collection."
    -        }, {
    -            "link":  "https://www.tumblr.com/",
    -            "logo": "tumblr.png",
    -            "logoBgColor": "#5eba8c",
    -            "description": "See <a href='http://highscalability.com/blog/2012/2/13/tumblr-architecture-15-billion-page-views-a-month-and-harder.html' target='_blank'>this</a>."
    -        }, {
    -            "link":  "http://twitter.com",
    -            "logo": "twitter.jpg",
    -            "logoBgColor": "#28a9e2",
    -            "description": "As part of their Storm stream processing infrastructure, e.g. <a href='http://engineering.twitter.com/2013/01/improving-twitter-search-with-real-time.html' target='_blank'>this</a> and <a href='https://blog.twitter.com/2015/handling-five-billion-sessions-a-day-in-real-time' target='_blank'>this</a>."
    -        }, {
    -            "link":  "http://www.paypal.com/",
    -            "logo": "paypal.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "See <a href='https://github.com/paypal/couchbasekafka' target='_blank'>this</a>."
    -        }, {
    -            "link":  "http://www.shopify.com/",
    -            "logo": "shopify.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Access logs, A/B testing events, domain events (&quot;a checkout happened&quot;, etc.), metrics, delivery to HDFS, and customer reporting. We are now focusing on consumers: analytics, support tools, and fraud analysis."
    -        },  {
    -            "link":  "http://www.oracle.com/technetwork/middleware/goldengate/overview/index.html",
    -            "logo": "oraclegoldengate.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "GoldenGate offers a comprehensive solution that streams transactional data from various sources into various big data targets including Kafka in real-time, enabling organizations to build fault -tolerant, highly reliable, and extensible analytical applications."
    -        },  {
    -            "link":  "http://www.socialtwist.com/",
    -            "logo": "socialtwist.jpg",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka internally as part of our reliable email queueing system."
    -        },  {
    -            "link":  "http://www.spongecell.com/",
    -            "logo": "spongecell.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka to run our entire analytics and monitoring pipeline driving both real-time and ETL applications for our customers."
    -        },  {
    -            "link":  "https://www.simple.com/",
    -            "logo": "simple.gif",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka at Simple for log aggregation and to power our analytics infrastructure."
    -        },  {
    -            "link":  "http://www.urbanairship.com/",
    -            "logo": "urbanairship.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "At Urban Airship we use Kafka to buffer incoming data points from mobile devices for processing by our analytics infrastructure."
    -        },  {
    -            "link":  "http://wooga.com/",
    -            "logo": "wooga.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka to aggregate and process tracking data from all our facebook games (which are hosted at various providers) in a central location."
    -        },  {
    -            "link":  "http://metamarkets.com/",
    -            "logo": "metamarkets.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka to ingest real-time event data, stream it to Storm and Hadoop, and then serve it from our Druid cluster to feed our interactive analytics dashboards. We've also built  connectors for directly ingesting events from Kafka into Druid."
    -        },  {
    -            "link":  "http://gnip.com/",
    -            "logo": "gnip.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is used in their twitter ingestion and processing pipeline."
    -        },  {
    -            "link":  "http://www.flyhajj.com/",
    -            "logo": "flyhajj.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka to collect all metrics and events generated by the users of the website."
    -        }, {
    -            "link":  "http://loggly.com/",
    -            "logo": "loggly.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Loggly is the world's most popular cloud-based log management. Our cloud-based log management service helps DevOps and technical teams make sense of the the massive quantity of logs. Kafka is used as part of our <a href='http://www.loggly.com/behind-the-screens' target='_blank'log collection and processing infrastructure.</a>"
    -        },  {
    -            "link":  "http://www.richrelevance.com/",
    -            "logo": "richrelevance.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Real-time tracking event pipeline."
    -        },  {
    -            "link":  "http://www.uswitch.com/",
    -            "logo": "uswitch.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "See <a href='http://oobaloo.co.uk/kafka-for-uswitchs-event-pipeline' target='_blank'>this blog</a>."
    -        }, {
    -            "link":  "http://www.infochimps.com/",
    -            "logo": "infochimps.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is part of the <a href='http://blog.infochimps.com/2012/10/30/next-gen-real-time-streaming-storm-kafka-integration' target='_blank'>InfoChimps real-time data platform</a>."
    -        }, {
    -            "link":  "http://www.ooyala.com/",
    -            "logo": "ooyala.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is used as the primary high speed message queue to power Storm and our real-time analytics/event ingestion pipelines."
    -        }, {
    -            "link":  "http://sematext.com/",
    -            "logo": "sematext.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "In <a href='http://sematext.com/spm' target='_blank'SPM</a> (performance monitoring + alerting), Kafka is used for metrics collection and feeds SPM's in-memory data aggregation (OLAP cube creation) as well as our CEP/Alerts servers (see also: <a href='http://blog.sematext.com/2013/10/16/announcement-spm-performance-monitoring-for-kafka/' target=_blank'>SPM for Kafka performance monitoring</a>). In <a href='http://sematext.com/search-analytics' target='_blank'>SA (search analytics)</a> Kafka is used in search and click stream collection before being aggregated and persisted. In <a href='http://sematext.com/logsene' target='_blank'Logsene (log analytics)</a> Kafka is used to pass logs and other events from front-end receivers to the persistent backend."
    -        }, {
    -            "link":  "http://quixey.com/",
    -            "logo": "quixey.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "At Quixey, The Search Engine for Apps, Kafka is an integral part of our eventing, logging and messaging infrastructure."
    -        }, {
    -            "link":  "http://www.linksmart.com/",
    -            "logo": "linksmart.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is used at LinkSmart as an event stream feeding Hadoop and custom real time systems."
    -        }, {
    -            "link":  "http://www.lucidworks.com/products/lucidworks-big-data",
    -            "logo": "lucidworks.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka for syncing LucidWorks Search (Solr) with incoming data from Hadoop and also for sending LucidWorks Search logs back to Hadoop for analysis."
    -        }, {
    -            "link":  "http://graylog2.org/",
    -            "logo": "graylog2.jpg",
    -            "logoBgColor": "#ffffff",
    -            "description": "Graylog2 is a free and open source log management and data analysis system. It's using Kafka as default transport for Graylog2 Radio. The use case is described <a href='http://support.torch.sh/help/kb/graylog2-server/using-graylog2-radio-v020x' target='_blank'here</a>."
    -        }, {
    -            "link":  "http://www.visualrevenue.com/",
    -            "logo": "visualrevenue.jpg",
    -            "logoBgColor": "#1c1a88",
    -            "description": "We use Kafka as a distributed queue in front of our web traffic stream processing infrastructure (Storm)."
    -        }, {
    -            "link":  "http://www.visualdna.com/",
    -            "logo": "visualdna.jpg",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka 1. as an infrastructure that helps us bring continuously the tracking events from various datacenters into our central hadoop cluster for offline processing, 2. as a propagation path for data integration, 3. as a real-time platform for future inference and recommendation engines"
    -        }, {
    -            "link":  "http://www.wizecommerce.com/",
    -            "logo": "wizecommerce.gif",
    -            "logoBgColor": "#ffffff",
    -            "description": "At Wize Commerce (previously, NexTag), Kafka is used as a distributed queue in front of Storm based processing for search index generation. We plan to also use it for collecting user generated data on our web tier, landing the data into various data sinks like Hadoop, HBase, etc."
    -        }, {
    -            "link":  "http://www.yieldbot.com/",
    -            "logo": "yieldbot.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Yieldbot uses kafka for real-time events, camus for batch loading, and mirrormakers for x-region replication."
    -        }, {
    -            "link":  "http://yellerapp.com/",
    -            "logo": "yeller.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Yeller uses Kafka to process large streams of incoming exception data for it's customers. Rate limiting, throttling and batching are all built on top of Kafka."
    -        }, {
    -            "link":  "http://www.hotels.com/",
    -            "logo": "hotels.jpg",
    -            "logoBgColor": "#ffffff",
    -            "description": "Hotels.com uses Kafka as pipeline to collect real time events from multiple sources and for sending data to HDFS."
    -        }, {
    -            "link":  "http://helprace.com/help-desk",
    -            "logo": "helprace.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is used as a distributed high speed message queue in our help desk software as well as our real-time event data aggregation and analytics."
    -        }, {
    -            "link":  "http://web.livefyre.com/",
    -            "logo": "livefyre.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Livefyre uses Kafka for the real time notifications, analytics pipeline and as the primary mechanism for general pub/sub."
    -        }, {
    -            "link":  "http://wikimediafoundation.org/wiki/Our_projects",
    -            "logo": "wikimedia.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Wikimedia Foundation uses Kafka as a log transport for analytics data from production webservers and applications.  This data is consumed into Hadoop using Camus and to other processors of analytics data."
    -        }, {
    -            "link":  "http://www.ovh.com/us/index.xml",
    -            "logo": "ovh.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "OVH uses Kafka in production for over a year now using it for event bus, data pipeline for antiddos and more to come."
    -        }, {
    -            "link":  "http://helpshift.com/",
    -            "logo": "helpshift.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Produces billions of events with Kafka through an erlang based producer ekaf that supports 8.0, and consumes topics primarily with storm and clojure."
    -        }, {
    -            "link":  "http://www.parsely.com/",
    -            "logo": "parsely.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is used for all <a href='http://www.parsely.com/misc/slides/logs/#1' target=_blank'>data integration </a> of analytics event data."
    -        }, {
    -            "link":  "https://www.vividcortex.com/",
    -            "logo": "vividcortex.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "VividCortex uses Kafka in our SaaS MySQL performance management platform to reliably ingest high-volume 1-second timeseries data."
    -        }, {
    -            "link":  "http://www.trivago.com/",
    -            "logo": "trivago.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Trivago uses Kafka for stream processing in Storm as well as processing of application logs."
    -        }, {
    -            "link":  "http://www.ants.vn/",
    -            "logo": "ants.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Ants.vn use Kafka in production for stream processing and log transfer (over 5B messages/month and growing)"
    -        }, {
    -            "link":  "http://www.ifttt.com/",
    -            "logo": "ifttt.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka to ingest real-time log and tracking data for analytics, dashboards, and machine learning."
    -        }, {
    -            "link":  "http://homeadvisor.com/",
    -            "logo": "homeadvisor.jpg",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka for logging and async event processing, among other uses."
    -        }, {
    -            "link":  "http://www.skyscanner.net/",
    -            "logo": "skyscanner.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "The world's travel search engine, uses Kafka for real-time log and event ingestion. It is the integration point for of all stream-processing and data transportation services."
    -        }, {
    -            "link":  "https://developer.ibm.com/messaging/message-hub/",
    -            "logo": "ibmmessagehub.png",
    -            "logoBgColor": "#1e3648",
    -            "description": "The Message Hub service in our Bluemix PaaS offers Kafka-based messaging in a multi-tenant, pay-as-you-go public cloud. It's intended to provide messaging services for microservices, event-driven processing and streaming data in to analytics systems."
    -        }, {
    -            "link":  "http://www.ipinyou.com.cn/?defaultLocale=en",
    -            "logo": "ipinyou.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "The largest DSP in China which has its HQ in Beijing and offices in Shanghai, Guangzhou, Silicon Valley and Seattle. Kafka clusters are the central data hub in iPinYou. All kinds of Internet display advertising data, such as bid/no-bid, impression, click, advertiser, conversion and etc., are collected as primary data streams into Kafka brokers in real time, by LogAggregator (a substitute for Apache Flume, which is implemented in C/C++ by iPinYou, has customized functionality, better performance, lower resource-consuming)."
    -        }, {
    -            "link":  "https://mailchimp.com/",
    -            "logo": "mailchimp.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka powers MailChimp’s data pipeline that in turn powers <a href='https://mailchimp.com/pro/' target=blank'>MailChimp Pro</a>, as well as an increasing number of other product features. You can read some of the details <a href='https://devs.mailchimp.com/blog/powering-mailchimp-pro-reporting/' target=blank'>here</a>."
    -        }, {
    -            "link":  "https://www.rabobank.com",
    -            "logo": "rabobank.jpg",
    -            "logoBgColor": "#ffffff",
    -            "description": "Rabobank is one of the 3 largest banks in the Netherlands. Its digital nervous system, the Business Event Bus, is powered by Apache Kafka. It is used by an increasing amount of financial processes and services, one which is Rabo Alerts. This service alerts customers in real-time upon financial events and is built using Kafka Streams."
    -        },{
    -            "link":  "http://www.portoseguro.com.br/",
    -            "logo": "porto-seguro.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka in production for online and near real-time solutions. Kafka is a core part for many products, such as our Credit Card System."
    -        },{
    -            "link":  "https://empathy.micronauticsresearch.com/",
    -            "logo": "robotCircle.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "<a href 'https://empathy.micronauticsresearch.com/' target=blank'> EmpathyWorks</a> is a framework for simulating and analyzing networks of artificial personalities."
    -        },{
    -            "link":  "https://www.cj.com/",
    -            "logo": "CJ_Affiliate.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Apache Kafka is used at CJ Affiliate to process many of the key events driving our core product. Nearly every aspect of CJ's products and services currently benefit from the speed and stability this provides; additionally, Apache Kafka is one of the key technologies enabling CJ's upcoming real-time Insights & Analytics platform."
    -        }, {
    -            "link":  "http://xitenetworks.com/",
    -            "logo": "xite.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is at the heart of our Data Infrastructure - Business Intelligence, Recommender Systems and Machine Learning solutions are build as reactive and streaming architecture. Also we use Kafka as a great alternative to REST APIs for micro-services integration. This allows us to scale and reliably upgrade micro-services without integration and consistency issues."
    -        },{
    -            "link":  "http://yahoo.com",
    -            "logo": "yahoo.png",
    -            "logoBgColor": "#3d018b",
    -            "description": "See <a href='http://yahooeng.tumblr.com/post/109994930921/kafka-yahoo' target='_blank'>this</a>."
    -        }, {
    -            "link":  "https://linecorp.com/",
    -            "logo": "line.png",
    -            "logoBgColor": "#00b900",
    -            "description": "LINE uses Apache Kafka as a central datahub for our services to communicate to one another. Hundreds of billions of messages are produced daily and are used to execute various business logic, threat detection, search indexing and data analysis. LINE leverages Kafka Streams to reliably transform and filter topics enabling sub topics consumers can efficiently consume, meanwhile retaining easy maintainability thanks to its sophisticated yet minimal code base."
    -        }
    -    ];
    +    {
    --- End diff --
    
    @guozhangwang no content change.
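
    For context (not part of the patch): the entries above all share the same shape -- `link`, `logo`, `logoBgColor`, `description` -- so a minimal sketch of how such an array could be rendered into logo tiles on the client looks roughly like the snippet below. The container id and image path are assumptions for illustration only, not the site's actual rendering code.

        // Minimal sketch: turn poweredByItems-style entries into logo tiles.
        // 'powered-by-grid' and the '/images/powered-by/' prefix are assumed
        // names for illustration only.
        var poweredByItems = [
            {
                "link": "https://kafka.apache.org/",
                "logo": "example.png",
                "logoBgColor": "#ffffff",
                "description": "Example entry."
            }
        ];

        function renderPoweredBy(items) {
            var container = document.getElementById('powered-by-grid'); // assumed container id
            items.forEach(function (item) {
                var tile = document.createElement('a');
                tile.href = item.link;
                tile.target = '_blank';
                tile.style.backgroundColor = item.logoBgColor;
                // strip any HTML from the description so it can be used as a plain tooltip
                tile.title = item.description.replace(/<[^>]+>/g, '');
                var img = document.createElement('img');
                img.src = '/images/powered-by/' + item.logo; // assumed image path
                tile.appendChild(img);
                container.appendChild(tile);
            });
        }

        renderPoweredBy(poweredByItems);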


---

[GitHub] kafka-site issue #104: Replace link

Posted by guozhangwang <gi...@git.apache.org>.
Github user guozhangwang commented on the issue:

    https://github.com/apache/kafka-site/pull/104
  
    LGTM. Could you submit a PR for the changes in `0110/streams` to the `kafka` repo as well?


---

[GitHub] kafka-site pull request #104: Replace link

Posted by manjuapu <gi...@git.apache.org>.
Github user manjuapu commented on a diff in the pull request:

    https://github.com/apache/kafka-site/pull/104#discussion_r147262245
  
    --- Diff: powered-by.html ---
    @@ -2,453 +2,452 @@
     <script>
         // powered by items
         var poweredByItems = [
    -        {
    -            "link":  "https://www.nytimes.com",
    -            "logo": "NYT.jpg",
    -            "logoBgColor": "#FFFFFF",
    -            "description": "<a href='https://www.confluent.io/blog/publishing-apache-kafka-new-york-times/'>The New York Times uses Apache Kafka </a>and the Kafka Streams API to store and distribute, in real-time, published content to the various applications and systems that make it available to the readers."
    -        }, {
    -            "link":  "http://pinterest.com",
    -            "logo": "pinterest.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is used with <a href='https://engineering.pinterest.com/blog/introducing-pinterest-secor' target='_blank'>Secor</a> as part of their <a href='https://www.pinterest.com/' target='_blank'>log collection pipeline</a>."
    -        }, {
    -            "link":  "http://www.zalando.com",
    -            "logo": "zalando.jpg",
    -            "logoBgColor": "#ffffff",
    -            "description": "As the leading online fashion retailer in Europe, Zalando uses Kafka as an ESB (Enterprise Service Bus), which helps us in transitioning from a monolithic to a micro services architecture. Using Kafka for processing <a href 'https://kafka-summit.org/sessions/using-kstreams-ktables-calculate-real-time-domain-rankings/' target=blank'> event streams</a> enables our technical team to do near-real time business intelligence."
    -        }, {
    -            "link":  "http://linkedin.com",
    -            "logo": "linkedin.jpg",
    -            "logoBgColor": "#007bb6",
    -            "description": "Apache Kafka is used at LinkedIn for activity stream data and operational metrics. This powers various products like LinkedIn Newsfeed, LinkedIn Today in addition to our offline analytics systems like Hadoop."
    -        }, {
    -            "link":  "http://addthis.com/",
    -            "logo": "addthis.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Apache Kafka is used at AddThis to collect events generated by our data network and broker that data to our analytics clusters and real-time web analytics platform."
    -        }, {
    -            "link":  "http://www.airbnb.com/",
    -            "logo": "airbnb.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Used in our event pipeline, exception tracking & more to come."
    -        }, {
    -            "link":  "http://www.ancestry.com/",
    -            "logo": "ancestry.svg",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is used as the <a href='http://blogs.ancestry.com/techroots/on-track-to-data-driven' target='_blank'event log processing pipeline </a>for delivering better personalized product and service to our customers."
    -        }, {
    -            "link":  "https://boundary.com/",
    -            "logo": "boundary.gif",
    -            "logoBgColor": "#ffffff",
    -            "description": "Apache Kafka aggregates high-flow message streams into a unified distributed pubsub service, brokering the data for other internal systems as part of Boundary's real-time network analytics infrastructure."
    -        },  {
    -            "link":  "http://www.cerner.com/",
    -            "logo": "cerner.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is used with HBase and Storm as described <a href='http://blog.cloudera.com/blog/2014/11/how-cerner-uses-cdh-with-apache-kafka/' target='_blank'here.</a>"
    -        }, {
    -            "link":  "https://www.coursera.org/",
    -            "logo": "coursera.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "At Coursera, Kafka powers education at scale, serving as the data pipeline for realtime learning analytics/dashboards."
    -        }, {
    -            "link":  "https://www.cloudflare.com/",
    -            "logo": "cloudfare.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "CloudFlare uses Kafka for our log processing and analytics pipeline, collecting hundreds of billions of events/day data from a thousands of servers."
    -        }, {
    -            "link":  "http://www.cloudphysics.com/",
    -            "logo": "cloudphysics.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is powering our high-flow event pipeline that aggregates over 1.2 billion metric series from 1000+ data centers for near-to-real time data center operational analytics and modeling"
    -        }, {
    -            "link":  "http://datasift.com/",
    -            "logo": "datasift.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Apache Kafka is used at DataSift as a collector of monitoring events and to track user's consumption of data streams in real time. <a href='http://highscalability.com/blog/2011/11/29/datasift-architecture-realtime-datamining-at-120000-tweets-p.html' target='_blank'>DataSift architecture</a>"
    -        }, {
    -            "link":  "http://datadog.com/",
    -            "logo": "datadog.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka brokers data to most systems in our metrics and events ingestion pipeline. Different modules contribute and consume data from it, for streaming CEP (homegrown), persistence (at different &quot;atemperatures&quot;a in Redis, ElasticSearch, Cassandra, S3), or batch analysis (Hadoop)."
    -        }, {
    -            "link":  "https://www.box.com/",
    -            "logo": "box.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "At Box, Kafka is used for the production analytics pipeline & real time monitoring infrastructure. We are planning to use Kafka for some of the new products & features"
    -        }, {
    -            "link":  "http://www.cisco.com/",
    -            "logo": "cisco.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Cisco is using Kafka as part of their OpenSOC (Security Operations Center). More details <a href='http://opensoc.github.io/' target='_blank'here.</a>"
    -        }, {
    -            "link":  "http://www.cityzendata.com/",
    -            "logo": "cityzen.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Cityzen Data uses Kafka as well, we provide a platform for collecting, storing and analyzing machine data."
    -        }, {
    -            "link":  "http://www.criteo.com/",
    -            "logo": "criteo.jpeg",
    -            "logoBgColor": "#ffffff",
    -            "description": "Criteo uses Kafka as well, we provide a platform for collecting, storing and analyzing machine data."
    -        }, {
    -            "link":  "https://www.etsy.com/",
    -            "logo": "etsy.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "See <a href='http://siliconangle.com/blog/2015/08/11/etsy-going-all-in-with-kafka-as-dataflow-pipeline-hpbigdata15/' target='_blank'>this article</a>."
    -        }, {
    -            "link":  "http://www.exponential.com/",
    -            "logo": "exponential.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Exponential is using Kafka in production to power the events ingestion pipeline for real time analytics and log feed consumption."
    -        }, {
    -            "link":  "https://www.exoscale.ch/",
    -            "logo": "exoscale.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Exoscale uses Kafka in production."
    -        }, {
    -            "link":  "http://www.liveperson.com/",
    -            "logo": "liveperson.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Using Kafka as the main data bus for all real time events."
    -        }, {
    -            "link":  "http://www.outbrain.com/",
    -            "logo": "outbrain.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka in production for real time log collection and processing, and for cross-DC cache propagation."
    -        }, {
    -            "link":  "http://www.retentionscience.com/",
    -            "logo": "retentionscience.jpg",
    -            "logoBgColor": "#ffffff",
    -            "description": "Click stream ingestion and processing."
    -        }, {
    -            "link":  "http://www.strava.com/",
    -            "logo": "strava.jpg",
    -            "logoBgColor": "#ffffff",
    -            "description": "Powers our analytics pipeline, activity feeds denorm and several other production services."
    -        }, {
    -            "link":  "http://www.swiftkey.net/",
    -            "logo": "swiftkey.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Apache Kafka for analytics event processing."
    -        }, {
    -            "link":  "https://eng.uber.com/",
    -            "logo": "uber.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Apache Kafka is a core part of Uber’s overall infrastructure stack and powers various online & near realtime use-cases."
    -        }, {
    -            "link":  "http://emergingthreats.net/",
    -            "logo": "emergingthreats.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Emerging threats uses Kafka in our event pipeline to process billions of malware events for search indices, alerting systems, etc."
    -        }, {
    -            "link":  "http://foursquare.com/",
    -            "logo": "foursquare.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka powers online to online messaging, and online to offline messaging at Foursquare. We integrate with monitoring, production systems, and our offline infrastructure, including hadoop."
    -        }, {
    -            "link":  "http://www.goldmansachs.com/",
    -            "logo": "goldmansachs.jpg",
    -            "logoBgColor": "#64a8f1",
    -            "description": "<a href='http://www.goldmansachs.com/' target='_blank'>www.goldmansachs.com</a>"
    -        }, {
    -            "link":  "http://www.mate1.com/about",
    -            "logo": "mate1.png",
    -            "logoBgColor": "#000000",
    -            "description": "Apache kafka is used at Mate1 as our main event bus that powers our news and activity feeds, automated review systems, and will soon power real time notifications and log distribution."
    -        }, {
    -            "link":  "http://mozilla.org/",
    -            "logo": "mozilla.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka will soon be replacing part of our current production system to collect performance and usage data from the end-users browser for projects like Telemetry, Test Pilot, etc. Downstream consumers usually persist to either HDFS or HBase."
    -        },  {
    -            "link":  "http://netflix.com",
    -            "logo": "netflix.png",
    -            "logoBgColor": "#FFFFFF",
    -            "description": "Real-time monitoring and event-processing <a href='http://techblog.netflix.com/2016/04/kafka-inside-keystone-pipeline.html' target='_blank'>pipeline</a>."
    -        },  {
    -            "link":  "http://www.oracle.com/",
    -            "logo": "oracle.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Oracle provides native connectivity to Kafka from its Enterprise Service Bus product called OSB (Oracle Service Bus) which allows developers to leverage OSB built-in mediation capabilities to implement staged data pipelines."
    -        },  {
    -            "link":  "http://square.com",
    -            "logo": "square.png",
    -            "logoBgColor": "#FFFFFF",
    -            "description": "We use Kafka as a bus to move all systems events through our various datacenters. This includes metrics, logs, custom events etc. On the consumer side, we output into Splunk, Graphite, Esper-like real-time alerting."
    -        }, {
    -            "link":  "http://spotify.com",
    -            "logo": "spotify.png",
    -            "logoBgColor": "#1ed760",
    -            "description": "Kafka is used at Spotify as part of their log <a href='http://www.meetup.com/stockholm-hug/events/121628932' target='_blank'>delivery system</a>."
    -        }, {
    -            "link":  "http://www.stumbleupon.com/",
    -            "logo": "stumbleupon.png",
    -            "logoBgColor": "#eb4924",
    -            "description": "Data collection platform for analytics."
    -        }, {
    -            "link":  "http://www.tagged.com/",
    -            "logo": "tagged.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Apache Kafka drives our new pub sub system which delivers real-time events for users in our latest game - Deckadence. It will soon be used in a host of new use cases including group chat and back end stats and log collection."
    -        }, {
    -            "link":  "https://www.tumblr.com/",
    -            "logo": "tumblr.png",
    -            "logoBgColor": "#5eba8c",
    -            "description": "See <a href='http://highscalability.com/blog/2012/2/13/tumblr-architecture-15-billion-page-views-a-month-and-harder.html' target='_blank'>this</a>."
    -        }, {
    -            "link":  "http://twitter.com",
    -            "logo": "twitter.jpg",
    -            "logoBgColor": "#28a9e2",
    -            "description": "As part of their Storm stream processing infrastructure, e.g. <a href='http://engineering.twitter.com/2013/01/improving-twitter-search-with-real-time.html' target='_blank'>this</a> and <a href='https://blog.twitter.com/2015/handling-five-billion-sessions-a-day-in-real-time' target='_blank'>this</a>."
    -        }, {
    -            "link":  "http://www.paypal.com/",
    -            "logo": "paypal.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "See <a href='https://github.com/paypal/couchbasekafka' target='_blank'>this</a>."
    -        }, {
    -            "link":  "http://www.shopify.com/",
    -            "logo": "shopify.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Access logs, A/B testing events, domain events (&quot;a checkout happened&quot;, etc.), metrics, delivery to HDFS, and customer reporting. We are now focusing on consumers: analytics, support tools, and fraud analysis."
    -        },  {
    -            "link":  "http://www.oracle.com/technetwork/middleware/goldengate/overview/index.html",
    -            "logo": "oraclegoldengate.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "GoldenGate offers a comprehensive solution that streams transactional data from various sources into various big data targets including Kafka in real-time, enabling organizations to build fault -tolerant, highly reliable, and extensible analytical applications."
    -        },  {
    -            "link":  "http://www.socialtwist.com/",
    -            "logo": "socialtwist.jpg",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka internally as part of our reliable email queueing system."
    -        },  {
    -            "link":  "http://www.spongecell.com/",
    -            "logo": "spongecell.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka to run our entire analytics and monitoring pipeline driving both real-time and ETL applications for our customers."
    -        },  {
    -            "link":  "https://www.simple.com/",
    -            "logo": "simple.gif",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka at Simple for log aggregation and to power our analytics infrastructure."
    -        },  {
    -            "link":  "http://www.urbanairship.com/",
    -            "logo": "urbanairship.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "At Urban Airship we use Kafka to buffer incoming data points from mobile devices for processing by our analytics infrastructure."
    -        },  {
    -            "link":  "http://wooga.com/",
    -            "logo": "wooga.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka to aggregate and process tracking data from all our facebook games (which are hosted at various providers) in a central location."
    -        },  {
    -            "link":  "http://metamarkets.com/",
    -            "logo": "metamarkets.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka to ingest real-time event data, stream it to Storm and Hadoop, and then serve it from our Druid cluster to feed our interactive analytics dashboards. We've also built  connectors for directly ingesting events from Kafka into Druid."
    -        },  {
    -            "link":  "http://gnip.com/",
    -            "logo": "gnip.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is used in their twitter ingestion and processing pipeline."
    -        },  {
    -            "link":  "http://www.flyhajj.com/",
    -            "logo": "flyhajj.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka to collect all metrics and events generated by the users of the website."
    -        }, {
    -            "link":  "http://loggly.com/",
    -            "logo": "loggly.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Loggly is the world's most popular cloud-based log management. Our cloud-based log management service helps DevOps and technical teams make sense of the the massive quantity of logs. Kafka is used as part of our <a href='http://www.loggly.com/behind-the-screens' target='_blank'log collection and processing infrastructure.</a>"
    -        },  {
    -            "link":  "http://www.richrelevance.com/",
    -            "logo": "richrelevance.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Real-time tracking event pipeline."
    -        },  {
    -            "link":  "http://www.uswitch.com/",
    -            "logo": "uswitch.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "See <a href='http://oobaloo.co.uk/kafka-for-uswitchs-event-pipeline' target='_blank'>this blog</a>."
    -        }, {
    -            "link":  "http://www.infochimps.com/",
    -            "logo": "infochimps.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is part of the <a href='http://blog.infochimps.com/2012/10/30/next-gen-real-time-streaming-storm-kafka-integration' target='_blank'>InfoChimps real-time data platform</a>."
    -        }, {
    -            "link":  "http://www.ooyala.com/",
    -            "logo": "ooyala.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is used as the primary high speed message queue to power Storm and our real-time analytics/event ingestion pipelines."
    -        }, {
    -            "link":  "http://sematext.com/",
    -            "logo": "sematext.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "In <a href='http://sematext.com/spm' target='_blank'SPM</a> (performance monitoring + alerting), Kafka is used for metrics collection and feeds SPM's in-memory data aggregation (OLAP cube creation) as well as our CEP/Alerts servers (see also: <a href='http://blog.sematext.com/2013/10/16/announcement-spm-performance-monitoring-for-kafka/' target=_blank'>SPM for Kafka performance monitoring</a>). In <a href='http://sematext.com/search-analytics' target='_blank'>SA (search analytics)</a> Kafka is used in search and click stream collection before being aggregated and persisted. In <a href='http://sematext.com/logsene' target='_blank'Logsene (log analytics)</a> Kafka is used to pass logs and other events from front-end receivers to the persistent backend."
    -        }, {
    -            "link":  "http://quixey.com/",
    -            "logo": "quixey.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "At Quixey, The Search Engine for Apps, Kafka is an integral part of our eventing, logging and messaging infrastructure."
    -        }, {
    -            "link":  "http://www.linksmart.com/",
    -            "logo": "linksmart.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is used at LinkSmart as an event stream feeding Hadoop and custom real time systems."
    -        }, {
    -            "link":  "http://www.lucidworks.com/products/lucidworks-big-data",
    -            "logo": "lucidworks.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka for syncing LucidWorks Search (Solr) with incoming data from Hadoop and also for sending LucidWorks Search logs back to Hadoop for analysis."
    -        }, {
    -            "link":  "http://graylog2.org/",
    -            "logo": "graylog2.jpg",
    -            "logoBgColor": "#ffffff",
    -            "description": "Graylog2 is a free and open source log management and data analysis system. It's using Kafka as default transport for Graylog2 Radio. The use case is described <a href='http://support.torch.sh/help/kb/graylog2-server/using-graylog2-radio-v020x' target='_blank'here</a>."
    -        }, {
    -            "link":  "http://www.visualrevenue.com/",
    -            "logo": "visualrevenue.jpg",
    -            "logoBgColor": "#1c1a88",
    -            "description": "We use Kafka as a distributed queue in front of our web traffic stream processing infrastructure (Storm)."
    -        }, {
    -            "link":  "http://www.visualdna.com/",
    -            "logo": "visualdna.jpg",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka 1. as an infrastructure that helps us bring continuously the tracking events from various datacenters into our central hadoop cluster for offline processing, 2. as a propagation path for data integration, 3. as a real-time platform for future inference and recommendation engines"
    -        }, {
    -            "link":  "http://www.wizecommerce.com/",
    -            "logo": "wizecommerce.gif",
    -            "logoBgColor": "#ffffff",
    -            "description": "At Wize Commerce (previously, NexTag), Kafka is used as a distributed queue in front of Storm based processing for search index generation. We plan to also use it for collecting user generated data on our web tier, landing the data into various data sinks like Hadoop, HBase, etc."
    -        }, {
    -            "link":  "http://www.yieldbot.com/",
    -            "logo": "yieldbot.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Yieldbot uses kafka for real-time events, camus for batch loading, and mirrormakers for x-region replication."
    -        }, {
    -            "link":  "http://yellerapp.com/",
    -            "logo": "yeller.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Yeller uses Kafka to process large streams of incoming exception data for it's customers. Rate limiting, throttling and batching are all built on top of Kafka."
    -        }, {
    -            "link":  "http://www.hotels.com/",
    -            "logo": "hotels.jpg",
    -            "logoBgColor": "#ffffff",
    -            "description": "Hotels.com uses Kafka as pipeline to collect real time events from multiple sources and for sending data to HDFS."
    -        }, {
    -            "link":  "http://helprace.com/help-desk",
    -            "logo": "helprace.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is used as a distributed high speed message queue in our help desk software as well as our real-time event data aggregation and analytics."
    -        }, {
    -            "link":  "http://web.livefyre.com/",
    -            "logo": "livefyre.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Livefyre uses Kafka for the real time notifications, analytics pipeline and as the primary mechanism for general pub/sub."
    -        }, {
    -            "link":  "http://wikimediafoundation.org/wiki/Our_projects",
    -            "logo": "wikimedia.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Wikimedia Foundation uses Kafka as a log transport for analytics data from production webservers and applications.  This data is consumed into Hadoop using Camus and to other processors of analytics data."
    -        }, {
    -            "link":  "http://www.ovh.com/us/index.xml",
    -            "logo": "ovh.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "OVH uses Kafka in production for over a year now using it for event bus, data pipeline for antiddos and more to come."
    -        }, {
    -            "link":  "http://helpshift.com/",
    -            "logo": "helpshift.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Produces billions of events with Kafka through an erlang based producer ekaf that supports 8.0, and consumes topics primarily with storm and clojure."
    -        }, {
    -            "link":  "http://www.parsely.com/",
    -            "logo": "parsely.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is used for all <a href='http://www.parsely.com/misc/slides/logs/#1' target=_blank'>data integration </a> of analytics event data."
    -        }, {
    -            "link":  "https://www.vividcortex.com/",
    -            "logo": "vividcortex.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "VividCortex uses Kafka in our SaaS MySQL performance management platform to reliably ingest high-volume 1-second timeseries data."
    -        }, {
    -            "link":  "http://www.trivago.com/",
    -            "logo": "trivago.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Trivago uses Kafka for stream processing in Storm as well as processing of application logs."
    -        }, {
    -            "link":  "http://www.ants.vn/",
    -            "logo": "ants.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Ants.vn use Kafka in production for stream processing and log transfer (over 5B messages/month and growing)"
    -        }, {
    -            "link":  "http://www.ifttt.com/",
    -            "logo": "ifttt.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka to ingest real-time log and tracking data for analytics, dashboards, and machine learning."
    -        }, {
    -            "link":  "http://homeadvisor.com/",
    -            "logo": "homeadvisor.jpg",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka for logging and async event processing, among other uses."
    -        }, {
    -            "link":  "http://www.skyscanner.net/",
    -            "logo": "skyscanner.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "The world's travel search engine, uses Kafka for real-time log and event ingestion. It is the integration point for of all stream-processing and data transportation services."
    -        }, {
    -            "link":  "https://developer.ibm.com/messaging/message-hub/",
    -            "logo": "ibmmessagehub.png",
    -            "logoBgColor": "#1e3648",
    -            "description": "The Message Hub service in our Bluemix PaaS offers Kafka-based messaging in a multi-tenant, pay-as-you-go public cloud. It's intended to provide messaging services for microservices, event-driven processing and streaming data in to analytics systems."
    -        }, {
    -            "link":  "http://www.ipinyou.com.cn/?defaultLocale=en",
    -            "logo": "ipinyou.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "The largest DSP in China which has its HQ in Beijing and offices in Shanghai, Guangzhou, Silicon Valley and Seattle. Kafka clusters are the central data hub in iPinYou. All kinds of Internet display advertising data, such as bid/no-bid, impression, click, advertiser, conversion and etc., are collected as primary data streams into Kafka brokers in real time, by LogAggregator (a substitute for Apache Flume, which is implemented in C/C++ by iPinYou, has customized functionality, better performance, lower resource-consuming)."
    -        }, {
    -            "link":  "https://mailchimp.com/",
    -            "logo": "mailchimp.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka powers MailChimp’s data pipeline that in turn powers <a href='https://mailchimp.com/pro/' target=blank'>MailChimp Pro</a>, as well as an increasing number of other product features. You can read some of the details <a href='https://devs.mailchimp.com/blog/powering-mailchimp-pro-reporting/' target=blank'>here</a>."
    -        }, {
    -            "link":  "https://www.rabobank.com",
    -            "logo": "rabobank.jpg",
    -            "logoBgColor": "#ffffff",
    -            "description": "Rabobank is one of the 3 largest banks in the Netherlands. Its digital nervous system, the Business Event Bus, is powered by Apache Kafka. It is used by an increasing amount of financial processes and services, one which is Rabo Alerts. This service alerts customers in real-time upon financial events and is built using Kafka Streams."
    -        },{
    -            "link":  "http://www.portoseguro.com.br/",
    -            "logo": "porto-seguro.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "We use Kafka in production for online and near real-time solutions. Kafka is a core part for many products, such as our Credit Card System."
    -        },{
    -            "link":  "https://empathy.micronauticsresearch.com/",
    -            "logo": "robotCircle.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "<a href 'https://empathy.micronauticsresearch.com/' target=blank'> EmpathyWorks</a> is a framework for simulating and analyzing networks of artificial personalities."
    -        },{
    -            "link":  "https://www.cj.com/",
    -            "logo": "CJ_Affiliate.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Apache Kafka is used at CJ Affiliate to process many of the key events driving our core product. Nearly every aspect of CJ's products and services currently benefit from the speed and stability this provides; additionally, Apache Kafka is one of the key technologies enabling CJ's upcoming real-time Insights & Analytics platform."
    -        }, {
    -            "link":  "http://xitenetworks.com/",
    -            "logo": "xite.png",
    -            "logoBgColor": "#ffffff",
    -            "description": "Kafka is at the heart of our Data Infrastructure - Business Intelligence, Recommender Systems and Machine Learning solutions are build as reactive and streaming architecture. Also we use Kafka as a great alternative to REST APIs for micro-services integration. This allows us to scale and reliably upgrade micro-services without integration and consistency issues."
    -        },{
    -            "link":  "http://yahoo.com",
    -            "logo": "yahoo.png",
    -            "logoBgColor": "#3d018b",
    -            "description": "See <a href='http://yahooeng.tumblr.com/post/109994930921/kafka-yahoo' target='_blank'>this</a>."
    -        }, {
    -            "link":  "https://linecorp.com/",
    -            "logo": "line.png",
    -            "logoBgColor": "#00b900",
    -            "description": "LINE uses Apache Kafka as a central datahub for our services to communicate to one another. Hundreds of billions of messages are produced daily and are used to execute various business logic, threat detection, search indexing and data analysis. LINE leverages Kafka Streams to reliably transform and filter topics enabling sub topics consumers can efficiently consume, meanwhile retaining easy maintainability thanks to its sophisticated yet minimal code base."
    -        }
    -    ];
    +    {
    --- End diff --
    
    @joel-hamill I have removed the 's' from 'Developers' and left the nav bar as it is, per our discussion.


---