Posted to commits@druid.apache.org by vi...@apache.org on 2023/03/24 17:48:32 UTC

[druid] branch master updated: Fix some broken links in docs (#13968)

This is an automated email from the ASF dual-hosted git repository.

victoria pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/druid.git


The following commit(s) were added to refs/heads/master by this push:
     new 976d39281f Fix some broken links in docs (#13968)
976d39281f is described below

commit 976d39281fa8903850432abac85f1bcb067dd867
Author: Jill Osborne <ji...@imply.io>
AuthorDate: Fri Mar 24 17:48:23 2023 +0000

    Fix some broken links in docs (#13968)
---
 docs/development/extensions-core/hdfs.md | 4 ++--
 docs/ingestion/hadoop.md                 | 4 ++--
 docs/operations/other-hadoop.md          | 2 +-
 docs/operations/tls-support.md           | 7 +++----
 4 files changed, 8 insertions(+), 9 deletions(-)

diff --git a/docs/development/extensions-core/hdfs.md b/docs/development/extensions-core/hdfs.md
index 8e72cae75b..a49041b245 100644
--- a/docs/development/extensions-core/hdfs.md
+++ b/docs/development/extensions-core/hdfs.md
@@ -55,7 +55,7 @@ To use the AWS S3 as the deep storage, you need to configure `druid.storage.stor
 |`druid.storage.type`|hdfs| |Must be set.|
 |`druid.storage.storageDirectory`|s3a://bucket/example/directory or s3n://bucket/example/directory|Path to the deep storage|Must be set.|
 
-You also need to include the [Hadoop AWS module](https://hadoop.apache.org/docs/current/hadoop-aws/tools/hadoop-aws/), especially the `hadoop-aws.jar` in the Druid classpath.
+You also need to include the [Hadoop AWS module](https://hadoop.apache.org/docs/stable/hadoop-aws/tools/hadoop-aws/), especially the `hadoop-aws.jar` in the Druid classpath.
 Run the below command to install the `hadoop-aws.jar` file under `${DRUID_HOME}/extensions/druid-hdfs-storage` in all nodes.
 
 ```bash
@@ -64,7 +64,7 @@ cp ${DRUID_HOME}/hadoop-dependencies/hadoop-aws/${HADOOP_VERSION}/hadoop-aws-${H
 ```
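For readers following along outside the diff: the copy step above can be sketched out concretely. This is an illustrative expansion only; the `DRUID_HOME` and `HADOOP_VERSION` values below are placeholders, not values from the commit or the Druid docs.

```shell
# Hedged sketch of the cp step above; DRUID_HOME and HADOOP_VERSION
# are illustrative placeholders, not values from the commit.
DRUID_HOME=/opt/druid
HADOOP_VERSION=3.3.6
SRC="${DRUID_HOME}/hadoop-dependencies/hadoop-aws/${HADOOP_VERSION}/hadoop-aws-${HADOOP_VERSION}.jar"
DEST="${DRUID_HOME}/extensions/druid-hdfs-storage/"
# Print the command rather than running it, so the expansion can be checked first
echo "cp ${SRC} ${DEST}"
```

Repeating this on every node (or scripting it over the node list) keeps the `druid-hdfs-storage` extension classpath consistent across the cluster.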
 
 Finally, you need to add the below properties in the `core-site.xml`.
-For more configurations, see the [Hadoop AWS module](https://hadoop.apache.org/docs/current/hadoop-aws/tools/hadoop-aws/).
+For more configurations, see the [Hadoop AWS module](https://hadoop.apache.org/docs/stable/hadoop-aws/tools/hadoop-aws/).
 
 ```xml
 <property>
diff --git a/docs/ingestion/hadoop.md b/docs/ingestion/hadoop.md
index 75cc0c5114..f0a868984d 100644
--- a/docs/ingestion/hadoop.md
+++ b/docs/ingestion/hadoop.md
@@ -150,7 +150,7 @@ For example, using the static input paths:
 
 You can also read from cloud storage such as AWS S3 or Google Cloud Storage.
 To do so, you need to install the necessary library under Druid's classpath in _all MiddleManager or Indexer processes_.
-For S3, you can run the below command to install the [Hadoop AWS module](https://hadoop.apache.org/docs/current/hadoop-aws/tools/hadoop-aws/).
+For S3, you can run the below command to install the [Hadoop AWS module](https://hadoop.apache.org/docs/stable/hadoop-aws/tools/hadoop-aws/).
 
 ```bash
 java -classpath "${DRUID_HOME}lib/*" org.apache.druid.cli.Main tools pull-deps -h "org.apache.hadoop:hadoop-aws:${HADOOP_VERSION}";
@@ -159,7 +159,7 @@ cp ${DRUID_HOME}/hadoop-dependencies/hadoop-aws/${HADOOP_VERSION}/hadoop-aws-${H
 
 Once you install the Hadoop AWS module in all MiddleManager and Indexer processes, you can put
 your S3 paths in the inputSpec with the below job properties.
-For more configurations, see the [Hadoop AWS module](https://hadoop.apache.org/docs/current/hadoop-aws/tools/hadoop-aws/).
+For more configurations, see the [Hadoop AWS module](https://hadoop.apache.org/docs/stable/hadoop-aws/tools/hadoop-aws/).
 
 ```
 "paths" : "s3a://billy-bucket/the/data/is/here/data.gz,s3a://billy-bucket/the/data/is/here/moredata.gz,s3a://billy-bucket/the/data/is/here/evenmoredata.gz"
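The hunk above is truncated before the job properties themselves. As a hedged illustration only (the `fs.s3a.*` property names come from the Hadoop AWS module; the values and the static-credential scheme are placeholders, not what the Druid docs prescribe), a `jobProperties` block alongside those paths might look like:

```
"jobProperties" : {
  "fs.s3a.impl" : "org.apache.hadoop.fs.s3a.S3AFileSystem",
  "fs.s3a.access.key" : "YOUR_ACCESS_KEY",
  "fs.s3a.secret.key" : "YOUR_SECRET_KEY"
}
```

Check the Hadoop AWS module documentation for the authentication mechanism appropriate to your deployment before copying these keys.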
diff --git a/docs/operations/other-hadoop.md b/docs/operations/other-hadoop.md
index f40b118f96..14a141a195 100644
--- a/docs/operations/other-hadoop.md
+++ b/docs/operations/other-hadoop.md
@@ -78,7 +78,7 @@ The following `jobProperties` excludes `javax.validation.` classes from being lo
 }
 ```
 
-[mapred-default.xml](https://hadoop.apache.org/docs/current/hadoop-mapreduce-client/hadoop-mapreduce-client-core/mapred-default.xml) documentation contains more information about this property.
+[mapred-default.xml](https://hadoop.apache.org/docs/stable/hadoop-mapreduce-client/hadoop-mapreduce-client-core/mapred-default.xml) documentation contains more information about this property.
 
 ## Tip #3: Use specific versions of Hadoop libraries
 
diff --git a/docs/operations/tls-support.md b/docs/operations/tls-support.md
index 7af0ec8050..7189af9f2f 100644
--- a/docs/operations/tls-support.md
+++ b/docs/operations/tls-support.md
@@ -2,7 +2,6 @@
 id: tls-support
 title: "TLS support"
 ---
-
 <!--
   ~ Licensed to the Apache Software Foundation (ASF) under one
   ~ or more contributor license agreements.  See the NOTICE file
@@ -37,10 +36,10 @@ and `druid.tlsPort` properties on each process. Please see `Configuration` secti
 Apache Druid uses Jetty as its embedded web server. 
 
 To get familiar with TLS/SSL, along with related concepts like keys and certificates,
-read [Configuring SSL/TLS](https://www.eclipse.org/jetty/documentation/current/configuring-ssl.html) in the Jetty documentation.
+read [Configuring Secure Protocols](https://www.eclipse.org/jetty/documentation/jetty-12/operations-guide/index.html#og-protocols-ssl) in the Jetty documentation.
 To get more in-depth knowledge of TLS/SSL support in Java in general, refer to the [Java Secure Socket Extension (JSSE) Reference Guide](http://docs.oracle.com/javase/8/docs/technotes/guides/security/jsse/JSSERefGuide.html).
-The [Configuring the Jetty SslContextFactory](https://www.eclipse.org/jetty/documentation/current/configuring-ssl.html#configuring-sslcontextfactory)
-section can help in understanding TLS/SSL configurations listed below. Finally, [Java Cryptography Architecture
+The [Class SslContextFactory](https://www.eclipse.org/jetty/javadoc/jetty-11/org/eclipse/jetty/util/ssl/SslContextFactory.html)
+reference doc can help in understanding TLS/SSL configurations listed below. Finally, [Java Cryptography Architecture
 Standard Algorithm Name Documentation for JDK 8](http://docs.oracle.com/javase/8/docs/technotes/guides/security/StandardNames.html) lists all possible
 values for the configs below, among others provided by Java implementation.
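To ground the TLS discussion above, here is a minimal sketch of the server-side properties involved. This is an assumption-laden example: the property names reflect Druid's TLS configuration as I recall it, and the paths, alias, and port are placeholders; verify everything against the Druid configuration reference before use.

```
# Hedged sketch of a minimal server-side TLS setup in runtime.properties;
# verify property names and defaults against the Druid configuration reference.
druid.enableTlsPort=true
druid.tlsPort=8283
druid.server.https.keyStorePath=/path/to/keystore.jks
druid.server.https.keyStoreType=jks
druid.server.https.certAlias=druid
```

The keystore referenced here is the artifact whose handling the JSSE and Jetty links above describe in depth.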
 


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@druid.apache.org
For additional commands, e-mail: commits-help@druid.apache.org