Posted to commits@commons.apache.org by lu...@apache.org on 2013/10/27 10:52:44 UTC
svn commit: r1536073 - in /commons/proper/math/trunk/src: changes/changes.xml
main/java/org/apache/commons/math3/analysis/differentiation/SparseGradient.java
test/java/org/apache/commons/math3/analysis/differentiation/SparseGradientTest.java
Author: luc
Date: Sun Oct 27 09:52:44 2013
New Revision: 1536073
URL: http://svn.apache.org/r1536073
Log:
Added SparseGradient to deal efficiently with numerous variables.
This new class is devoted to differentiation computation when the number
of variables is very large but most computations depend only on a small
subset of the variables.
Thanks to Ajo Fod for the contribution.
JIRA: MATH-1036
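The idea behind the new class can be illustrated with a small stdlib-only sketch (the class and method names below are illustrative, not the commons-math API): each value carries a sparse map from variable index to partial derivative, so operations only touch the few variables actually involved, no matter how many variables exist overall.

```java
import java.util.HashMap;
import java.util.Map;

/**
 * Minimal stdlib-only sketch of sparse forward-mode first derivatives.
 * Illustrative only; the real implementation is SparseGradient below.
 */
public class SparseAdSketch {

    static final class Grad {
        final double value;
        final Map<Integer, Double> partials = new HashMap<>();

        Grad(double value) {
            this.value = value;
        }

        static Grad variable(int idx, double v) {
            Grad g = new Grad(v);
            g.partials.put(idx, 1.0); // d(x_i)/d(x_i) = 1
            return g;
        }

        Grad multiply(Grad a) {
            Grad out = new Grad(value * a.value);
            // product rule d(uv) = v du + u dv, merging only present keys
            for (Map.Entry<Integer, Double> e : partials.entrySet()) {
                out.partials.merge(e.getKey(), a.value * e.getValue(), Double::sum);
            }
            for (Map.Entry<Integer, Double> e : a.partials.entrySet()) {
                out.partials.merge(e.getKey(), value * e.getValue(), Double::sum);
            }
            return out;
        }

        double derivative(int idx) {
            return partials.getOrDefault(idx, 0.0);
        }
    }

    public static void main(String[] args) {
        // f(x, y) = x * y at x = 3, y = 5; only two map entries are stored
        // even if thousands of other variables exist in the computation
        Grad x = Grad.variable(0, 3.0);
        Grad y = Grad.variable(1, 5.0);
        Grad f = x.multiply(y);
        System.out.println(f.value);         // 15.0
        System.out.println(f.derivative(0)); // df/dx = y = 5.0
        System.out.println(f.derivative(1)); // df/dy = x = 3.0
    }
}
```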
Added:
commons/proper/math/trunk/src/main/java/org/apache/commons/math3/analysis/differentiation/SparseGradient.java (with props)
commons/proper/math/trunk/src/test/java/org/apache/commons/math3/analysis/differentiation/SparseGradientTest.java (with props)
Modified:
commons/proper/math/trunk/src/changes/changes.xml
Modified: commons/proper/math/trunk/src/changes/changes.xml
URL: http://svn.apache.org/viewvc/commons/proper/math/trunk/src/changes/changes.xml?rev=1536073&r1=1536072&r2=1536073&view=diff
==============================================================================
--- commons/proper/math/trunk/src/changes/changes.xml (original)
+++ commons/proper/math/trunk/src/changes/changes.xml Sun Oct 27 09:52:44 2013
@@ -51,6 +51,11 @@ If the output is not quite correct, chec
</properties>
<body>
<release version="x.y" date="TBD" description="TBD">
+ <action dev="luc" type="add" issue="MATH-1036" due-to="Ajo Fod">
+ Added SparseGradient to deal efficiently with first derivatives when the number
+ of variables is very large but most computations depend only on a few of the
+ variables.
+ </action>
<action dev="psteitz" type="update" issue="MATH-1039" due-to="Aleksei Dievskii">
Added logDensity methods to AbstractReal/IntegerDistribution with naive default
implementations and improved implementations for some current distributions.
Added: commons/proper/math/trunk/src/main/java/org/apache/commons/math3/analysis/differentiation/SparseGradient.java
URL: http://svn.apache.org/viewvc/commons/proper/math/trunk/src/main/java/org/apache/commons/math3/analysis/differentiation/SparseGradient.java?rev=1536073&view=auto
==============================================================================
--- commons/proper/math/trunk/src/main/java/org/apache/commons/math3/analysis/differentiation/SparseGradient.java (added)
+++ commons/proper/math/trunk/src/main/java/org/apache/commons/math3/analysis/differentiation/SparseGradient.java Sun Oct 27 09:52:44 2013
@@ -0,0 +1,926 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.commons.math3.analysis.differentiation;
+
+import java.io.Serializable;
+import java.util.Collections;
+import java.util.HashMap;
+import java.util.Map;
+
+import org.apache.commons.math3.Field;
+import org.apache.commons.math3.FieldElement;
+import org.apache.commons.math3.RealFieldElement;
+import org.apache.commons.math3.exception.DimensionMismatchException;
+import org.apache.commons.math3.util.FastMath;
+import org.apache.commons.math3.util.MathArrays;
+import org.apache.commons.math3.util.MathUtils;
+import org.apache.commons.math3.util.Precision;
+
+/**
+ * First derivative computation with large number of variables.
+ * <p>
+ * This class plays a similar role to {@link DerivativeStructure}, with
+ * a focus on efficiency when dealing with a large number of independent
+ * variables, when most computations depend on only a few of them, and when
+ * only the first derivative is needed. When these conditions are met, this
+ * class should be much faster than {@link DerivativeStructure} and use less memory.
+ * </p>
+ *
+ * @version $Id$
+ * @since 3.3
+ */
+public class SparseGradient implements RealFieldElement<SparseGradient>, Serializable {
+
+ /** Serializable UID. */
+ private static final long serialVersionUID = 20131025L;
+
+ /** Value of the calculation. */
+ private double value;
+
+ /** Stored derivative, each key representing a different independent variable. */
+ private final Map<Integer, Double> derivatives;
+
+ /** Internal constructor.
+ * @param value value of the function
+ * @param derivatives derivatives map, a deep copy will be performed,
+ * so the map given here will remain safe from changes in the new instance,
+ * may be null to create an empty derivatives map, i.e. a constant value
+ */
+ private SparseGradient(final double value, final Map<Integer, Double> derivatives) {
+ this.value = value;
+ this.derivatives = new HashMap<Integer, Double>();
+ if (derivatives != null) {
+ this.derivatives.putAll(derivatives);
+ }
+ }
+
+ /** Internal constructor.
+ * @param value value of the function
+ * @param scale scaling factor to apply to all derivatives
+ * @param derivatives derivatives map, a deep copy will be performed,
+ * so the map given here will remain safe from changes in the new instance,
+ * may be null to create an empty derivatives map, i.e. a constant value
+ */
+ private SparseGradient(final double value, final double scale,
+ final Map<Integer, Double> derivatives) {
+ this.value = value;
+ this.derivatives = new HashMap<Integer, Double>();
+ if (derivatives != null) {
+ for (final Map.Entry<Integer, Double> entry : derivatives.entrySet()) {
+ this.derivatives.put(entry.getKey(), scale * entry.getValue());
+ }
+ }
+ }
+
+ /** Factory method creating a constant.
+ * @param value value of the constant
+ * @return a new instance
+ */
+ public static SparseGradient createConstant(final double value) {
+ return new SparseGradient(value, Collections.<Integer, Double> emptyMap());
+ }
+
+ /** Factory method creating an independent variable.
+ * @param idx index of the variable
+ * @param value value of the variable
+ * @return a new instance
+ */
+ public static SparseGradient createVariable(final int idx, final double value) {
+ return new SparseGradient(value, Collections.singletonMap(idx, 1.0));
+ }
+
+ /**
+ * Find the number of variables.
+ * @return number of variables
+ */
+ public int numVars() {
+ return derivatives.size();
+ }
+
+ /**
+ * Get the derivative with respect to a particular independent variable.
+ *
+ * @param index index of the variable to differentiate with respect to
+ * @return derivative with respect to the variable, or 0 if the instance
+ * does not depend on it
+ */
+ public double getDerivative(final int index) {
+ final Double out = derivatives.get(index);
+ return (out == null) ? 0.0 : out;
+ }
+
+ /**
+ * Get the value of the function.
+ * @return value of the function.
+ */
+ public double getValue() {
+ return value;
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public double getReal() {
+ return value;
+ }
+
+ /** {@inheritDoc} */
+ public SparseGradient add(final SparseGradient a) {
+ final SparseGradient out = new SparseGradient(value + a.value, derivatives);
+ for (Map.Entry<Integer, Double> entry : a.derivatives.entrySet()) {
+ final int id = entry.getKey();
+ final Double old = out.derivatives.get(id);
+ if (old == null) {
+ out.derivatives.put(id, entry.getValue());
+ } else {
+ out.derivatives.put(id, old + entry.getValue());
+ }
+ }
+
+ return out;
+ }
+
+ /**
+ * Add in place.
+ * <p>
+ * This method is designed to be faster when used multiple times in a loop.
+ * </p>
+ * <p>
+ * This instance is modified by the call; to leave the instance unchanged,
+ * use the {@link #add(SparseGradient)} method instead.
+ * </p>
+ * @param a instance to add
+ */
+ public void addInPlace(final SparseGradient a) {
+ value += a.value;
+ for (final Map.Entry<Integer, Double> entry : a.derivatives.entrySet()) {
+ final int id = entry.getKey();
+ final Double old = derivatives.get(id);
+ if (old == null) {
+ derivatives.put(id, entry.getValue());
+ } else {
+ derivatives.put(id, old + entry.getValue());
+ }
+ }
+ }
+
+ /** {@inheritDoc} */
+ public SparseGradient add(final double c) {
+ final SparseGradient out = new SparseGradient(value + c, derivatives);
+ return out;
+ }
+
+ /** {@inheritDoc} */
+ public SparseGradient subtract(final SparseGradient a) {
+ final SparseGradient out = new SparseGradient(value - a.value, derivatives);
+ for (Map.Entry<Integer, Double> entry : a.derivatives.entrySet()) {
+ final int id = entry.getKey();
+ final Double old = out.derivatives.get(id);
+ if (old == null) {
+ out.derivatives.put(id, -entry.getValue());
+ } else {
+ out.derivatives.put(id, old - entry.getValue());
+ }
+ }
+ return out;
+ }
+
+ /** {@inheritDoc} */
+ public SparseGradient subtract(double c) {
+ return new SparseGradient(value - c, derivatives);
+ }
+
+ /** {@inheritDoc} */
+ public SparseGradient multiply(final SparseGradient a) {
+ final SparseGradient out =
+ new SparseGradient(value * a.value, Collections.<Integer, Double> emptyMap());
+
+ // Derivatives.
+ for (Map.Entry<Integer, Double> entry : derivatives.entrySet()) {
+ out.derivatives.put(entry.getKey(), a.value * entry.getValue());
+ }
+ for (Map.Entry<Integer, Double> entry : a.derivatives.entrySet()) {
+ final int id = entry.getKey();
+ final Double old = out.derivatives.get(id);
+ if (old == null) {
+ out.derivatives.put(id, value * entry.getValue());
+ } else {
+ out.derivatives.put(id, old + value * entry.getValue());
+ }
+ }
+ return out;
+ }
+
+ /**
+ * Multiply in place.
+ * <p>
+ * This method is designed to be faster when used multiple times in a loop.
+ * </p>
+ * <p>
+ * This instance is modified by the call; to leave the instance unchanged,
+ * use the {@link #multiply(SparseGradient)} method instead.
+ * </p>
+ * @param a instance to multiply
+ */
+ public void multInPlace(final SparseGradient a) {
+ // Derivatives.
+ for (Map.Entry<Integer, Double> entry : derivatives.entrySet()) {
+ derivatives.put(entry.getKey(), a.value * entry.getValue());
+ }
+ for (Map.Entry<Integer, Double> entry : a.derivatives.entrySet()) {
+ final int id = entry.getKey();
+ final Double old = derivatives.get(id);
+ if (old == null) {
+ derivatives.put(id, value * entry.getValue());
+ } else {
+ derivatives.put(id, old + value * entry.getValue());
+ }
+ }
+ value *= a.value;
+ }
+
+ /** {@inheritDoc} */
+ public SparseGradient multiply(final double c) {
+ return new SparseGradient(value * c, c, derivatives);
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient multiply(final int n) {
+ return new SparseGradient(value * n, n, derivatives);
+ }
+
+ /** {@inheritDoc} */
+ public SparseGradient divide(final SparseGradient a) {
+ final SparseGradient out = new SparseGradient(value / a.value, Collections.<Integer, Double> emptyMap());
+
+ // Derivatives.
+ for (Map.Entry<Integer, Double> entry : derivatives.entrySet()) {
+ out.derivatives.put(entry.getKey(), entry.getValue() / a.value);
+ }
+ for (Map.Entry<Integer, Double> entry : a.derivatives.entrySet()) {
+ final int id = entry.getKey();
+ final Double old = out.derivatives.get(id);
+ if (old == null) {
+ out.derivatives.put(id, -out.value / a.value * entry.getValue());
+ } else {
+ out.derivatives.put(id, old - out.value / a.value * entry.getValue());
+ }
+ }
+ return out;
+ }
+
+ /** {@inheritDoc} */
+ public SparseGradient divide(final double c) {
+ return new SparseGradient(value / c, 1.0 / c, derivatives);
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient negate() {
+ return new SparseGradient(-value, -1.0, derivatives);
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public Field<SparseGradient> getField() {
+ return new Field<SparseGradient>() {
+
+ /** {@inheritDoc} */
+ public SparseGradient getZero() {
+ return createConstant(0);
+ }
+
+ /** {@inheritDoc} */
+ public SparseGradient getOne() {
+ return createConstant(1);
+ }
+
+ /** {@inheritDoc} */
+ public Class<? extends FieldElement<SparseGradient>> getRuntimeClass() {
+ return SparseGradient.class;
+ }
+
+ };
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient remainder(final double a) {
+ return new SparseGradient(FastMath.IEEEremainder(value, a), derivatives);
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient remainder(final SparseGradient a) {
+
+ // compute k such that lhs % rhs = lhs - k rhs
+ final double rem = FastMath.IEEEremainder(value, a.value);
+ final double k = FastMath.rint((value - rem) / a.value);
+
+ return subtract(a.multiply(k));
+
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient abs() {
+ if (Double.doubleToLongBits(value) < 0) {
+ // we use the bits representation to also handle -0.0
+ return negate();
+ } else {
+ return this;
+ }
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient ceil() {
+ return createConstant(FastMath.ceil(value));
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient floor() {
+ return createConstant(FastMath.floor(value));
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient rint() {
+ return createConstant(FastMath.rint(value));
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public long round() {
+ return FastMath.round(value);
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient signum() {
+ return createConstant(FastMath.signum(value));
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient copySign(final SparseGradient sign) {
+ final long m = Double.doubleToLongBits(value);
+ final long s = Double.doubleToLongBits(sign.value);
+ if ((m >= 0 && s >= 0) || (m < 0 && s < 0)) { // Sign is currently OK
+ return this;
+ }
+ return negate(); // flip sign
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient copySign(final double sign) {
+ final long m = Double.doubleToLongBits(value);
+ final long s = Double.doubleToLongBits(sign);
+ if ((m >= 0 && s >= 0) || (m < 0 && s < 0)) { // Sign is currently OK
+ return this;
+ }
+ return negate(); // flip sign
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient scalb(final int n) {
+ final SparseGradient out = new SparseGradient(FastMath.scalb(value, n), Collections.<Integer, Double> emptyMap());
+ for (Map.Entry<Integer, Double> entry : derivatives.entrySet()) {
+ out.derivatives.put(entry.getKey(), FastMath.scalb(entry.getValue(), n));
+ }
+ return out;
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient hypot(final SparseGradient y) {
+ if (Double.isInfinite(value) || Double.isInfinite(y.value)) {
+ return createConstant(Double.POSITIVE_INFINITY);
+ } else if (Double.isNaN(value) || Double.isNaN(y.value)) {
+ return createConstant(Double.NaN);
+ } else {
+
+ final int expX = FastMath.getExponent(value);
+ final int expY = FastMath.getExponent(y.value);
+ if (expX > expY + 27) {
+ // y is negligible with respect to x
+ return abs();
+ } else if (expY > expX + 27) {
+ // x is negligible with respect to y
+ return y.abs();
+ } else {
+
+ // find an intermediate scale to avoid both overflow and underflow
+ final int middleExp = (expX + expY) / 2;
+
+ // scale parameters without losing precision
+ final SparseGradient scaledX = scalb(-middleExp);
+ final SparseGradient scaledY = y.scalb(-middleExp);
+
+ // compute scaled hypotenuse
+ final SparseGradient scaledH =
+ scaledX.multiply(scaledX).add(scaledY.multiply(scaledY)).sqrt();
+
+ // remove scaling
+ return scaledH.scalb(middleExp);
+
+ }
+
+ }
+ }
+
+ /**
+ * Returns the hypotenuse of a triangle with sides {@code x} and {@code y}
+ * - sqrt(<i>x</i><sup>2</sup> +<i>y</i><sup>2</sup>)<br/>
+ * avoiding intermediate overflow or underflow.
+ *
+ * <ul>
+ * <li> If either argument is infinite, then the result is positive infinity.</li>
+ * <li> Otherwise, if either argument is NaN, then the result is NaN.</li>
+ * </ul>
+ *
+ * @param x a value
+ * @param y a value
+ * @return sqrt(<i>x</i><sup>2</sup> +<i>y</i><sup>2</sup>)
+ */
+ public static SparseGradient hypot(final SparseGradient x, final SparseGradient y) {
+ return x.hypot(y);
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient reciprocal() {
+ return new SparseGradient(1.0 / value, -1.0 / (value * value), derivatives);
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient sqrt() {
+ final double sqrt = FastMath.sqrt(value);
+ return new SparseGradient(sqrt, 0.5 / sqrt, derivatives);
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient cbrt() {
+ final double cbrt = FastMath.cbrt(value);
+ return new SparseGradient(cbrt, 1.0 / (3 * cbrt * cbrt), derivatives);
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient rootN(final int n) {
+ if (n == 2) {
+ return sqrt();
+ } else if (n == 3) {
+ return cbrt();
+ } else {
+ final double root = FastMath.pow(value, 1.0 / n);
+ return new SparseGradient(root, 1.0 / (n * FastMath.pow(root, n - 1)), derivatives);
+ }
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient pow(final double p) {
+ return new SparseGradient(FastMath.pow(value, p), p * FastMath.pow(value, p - 1), derivatives);
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient pow(final int n) {
+ if (n == 0) {
+ return getField().getOne();
+ } else {
+ final double valueNm1 = FastMath.pow(value, n - 1);
+ return new SparseGradient(value * valueNm1, n * valueNm1, derivatives);
+ }
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient pow(final SparseGradient e) {
+ return log().multiply(e).exp();
+ }
+
+ /** Compute a<sup>x</sup> where a is a double and x is a {@link SparseGradient}.
+ * @param a number to exponentiate
+ * @param x power to apply
+ * @return a<sup>x</sup>
+ */
+ public static SparseGradient pow(final double a, final SparseGradient x) {
+ if (a == 0) {
+ if (x.value == 0) {
+ return x.compose(1.0, Double.NEGATIVE_INFINITY);
+ } else if (x.value < 0) {
+ return x.compose(Double.NaN, Double.NaN);
+ } else {
+ return x.getField().getZero();
+ }
+ } else {
+ final double ax = FastMath.pow(a, x.value);
+ return new SparseGradient(ax, ax * FastMath.log(a), x.derivatives);
+ }
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient exp() {
+ final double e = FastMath.exp(value);
+ return new SparseGradient(e, e, derivatives);
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient expm1() {
+ return new SparseGradient(FastMath.expm1(value), FastMath.exp(value), derivatives);
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient log() {
+ return new SparseGradient(FastMath.log(value), 1.0 / value, derivatives);
+ }
+
+ /** Base 10 logarithm.
+ * @return base 10 logarithm of the instance
+ */
+ public SparseGradient log10() {
+ return new SparseGradient(FastMath.log10(value), 1.0 / (FastMath.log(10.0) * value), derivatives);
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient log1p() {
+ return new SparseGradient(FastMath.log1p(value), 1.0 / (1.0 + value), derivatives);
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient cos() {
+ return new SparseGradient(FastMath.cos(value), -FastMath.sin(value), derivatives);
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient sin() {
+ return new SparseGradient(FastMath.sin(value), FastMath.cos(value), derivatives);
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient tan() {
+ final double t = FastMath.tan(value);
+ return new SparseGradient(t, 1 + t * t, derivatives);
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient acos() {
+ return new SparseGradient(FastMath.acos(value), -1.0 / FastMath.sqrt(1 - value * value), derivatives);
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient asin() {
+ return new SparseGradient(FastMath.asin(value), 1.0 / FastMath.sqrt(1 - value * value), derivatives);
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient atan() {
+ return new SparseGradient(FastMath.atan(value), 1.0 / (1 + value * value), derivatives);
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient atan2(final SparseGradient x) {
+
+ // compute r = sqrt(x^2+y^2)
+ final SparseGradient r = multiply(this).add(x.multiply(x)).sqrt();
+
+ final SparseGradient a;
+ if (x.value >= 0) {
+
+ // compute atan2(y, x) = 2 atan(y / (r + x))
+ a = divide(r.add(x)).atan().multiply(2);
+
+ } else {
+
+ // compute atan2(y, x) = +/- pi - 2 atan(y / (r - x))
+ final SparseGradient tmp = divide(r.subtract(x)).atan().multiply(-2);
+ a = tmp.add(tmp.value <= 0 ? -FastMath.PI : FastMath.PI);
+
+ }
+
+ // fix value to take special cases (+0/+0, +0/-0, -0/+0, -0/-0, +/-infinity) correctly
+ a.value = FastMath.atan2(value, x.value);
+
+ return a;
+
+ }
+
+ /** Two-argument arc tangent operation.
+ * @param y first argument of the arc tangent
+ * @param x second argument of the arc tangent
+ * @return atan2(y, x)
+ */
+ public static SparseGradient atan2(final SparseGradient y, final SparseGradient x) {
+ return y.atan2(x);
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient cosh() {
+ return new SparseGradient(FastMath.cosh(value), FastMath.sinh(value), derivatives);
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient sinh() {
+ return new SparseGradient(FastMath.sinh(value), FastMath.cosh(value), derivatives);
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient tanh() {
+ final double t = FastMath.tanh(value);
+ return new SparseGradient(t, 1 - t * t, derivatives);
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient acosh() {
+ return new SparseGradient(FastMath.acosh(value), 1.0 / FastMath.sqrt(value * value - 1.0), derivatives);
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient asinh() {
+ return new SparseGradient(FastMath.asinh(value), 1.0 / FastMath.sqrt(value * value + 1.0), derivatives);
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient atanh() {
+ return new SparseGradient(FastMath.atanh(value), 1.0 / (1.0 - value * value), derivatives);
+ }
+
+ /** Convert radians to degrees, with error of less than 0.5 ULP
+ * @return instance converted into degrees
+ */
+ public SparseGradient toDegrees() {
+ return new SparseGradient(FastMath.toDegrees(value), FastMath.toDegrees(1.0), derivatives);
+ }
+
+ /** Convert degrees to radians, with error of less than 0.5 ULP
+ * @return instance converted into radians
+ */
+ public SparseGradient toRadians() {
+ return new SparseGradient(FastMath.toRadians(value), FastMath.toRadians(1.0), derivatives);
+ }
+
+ /** Evaluate Taylor expansion of a sparse gradient.
+ * @param delta parameters offsets (Δx, Δy, ...)
+ * @return value of the Taylor expansion at x + Δx, y + Δy, ...
+ */
+ public double taylor(final double ... delta) {
+ double y = value;
+ for (int i = 0; i < delta.length; ++i) {
+ y += delta[i] * getDerivative(i);
+ }
+ return y;
+ }
+
+ /** Compute composition of the instance by a univariate function.
+ * @param f0 value of the function at the current point (i.e. f({@link #getValue()}))
+ * @param f1 first derivative of the function at
+ * the current point (i.e. f'({@link #getValue()}))
+ * @return f(this)
+ */
+ public SparseGradient compose(final double f0, final double f1) {
+ return new SparseGradient(f0, f1, derivatives);
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient linearCombination(final SparseGradient[] a,
+ final SparseGradient[] b)
+ throws DimensionMismatchException {
+
+ // compute a simple value, with all partial derivatives
+ SparseGradient out = a[0].getField().getZero();
+ for (int i = 0; i < a.length; ++i) {
+ out = out.add(a[i].multiply(b[i]));
+ }
+
+ // recompute an accurate value, taking care of cancellations
+ final double[] aDouble = new double[a.length];
+ for (int i = 0; i < a.length; ++i) {
+ aDouble[i] = a[i].getValue();
+ }
+ final double[] bDouble = new double[b.length];
+ for (int i = 0; i < b.length; ++i) {
+ bDouble[i] = b[i].getValue();
+ }
+ out.value = MathArrays.linearCombination(aDouble, bDouble);
+
+ return out;
+
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient linearCombination(final double[] a, final SparseGradient[] b) {
+
+ // compute a simple value, with all partial derivatives
+ SparseGradient out = b[0].getField().getZero();
+ for (int i = 0; i < a.length; ++i) {
+ out = out.add(b[i].multiply(a[i]));
+ }
+
+ // recompute an accurate value, taking care of cancellations
+ final double[] bDouble = new double[b.length];
+ for (int i = 0; i < b.length; ++i) {
+ bDouble[i] = b[i].getValue();
+ }
+ out.value = MathArrays.linearCombination(a, bDouble);
+
+ return out;
+
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient linearCombination(final SparseGradient a1, final SparseGradient b1,
+ final SparseGradient a2, final SparseGradient b2) {
+
+ // compute a simple value, with all partial derivatives
+ SparseGradient out = a1.multiply(b1).add(a2.multiply(b2));
+
+ // recompute an accurate value, taking care of cancellations
+ out.value = MathArrays.linearCombination(a1.value, b1.value, a2.value, b2.value);
+
+ return out;
+
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient linearCombination(final double a1, final SparseGradient b1,
+ final double a2, final SparseGradient b2) {
+
+ // compute a simple value, with all partial derivatives
+ SparseGradient out = b1.multiply(a1).add(b2.multiply(a2));
+
+ // recompute an accurate value, taking care of cancellations
+ out.value = MathArrays.linearCombination(a1, b1.value, a2, b2.value);
+
+ return out;
+
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient linearCombination(final SparseGradient a1, final SparseGradient b1,
+ final SparseGradient a2, final SparseGradient b2,
+ final SparseGradient a3, final SparseGradient b3) {
+
+ // compute a simple value, with all partial derivatives
+ SparseGradient out = a1.multiply(b1).add(a2.multiply(b2)).add(a3.multiply(b3));
+
+ // recompute an accurate value, taking care of cancellations
+ out.value = MathArrays.linearCombination(a1.value, b1.value,
+ a2.value, b2.value,
+ a3.value, b3.value);
+
+ return out;
+
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient linearCombination(final double a1, final SparseGradient b1,
+ final double a2, final SparseGradient b2,
+ final double a3, final SparseGradient b3) {
+
+ // compute a simple value, with all partial derivatives
+ SparseGradient out = b1.multiply(a1).add(b2.multiply(a2)).add(b3.multiply(a3));
+
+ // recompute an accurate value, taking care of cancellations
+ out.value = MathArrays.linearCombination(a1, b1.value,
+ a2, b2.value,
+ a3, b3.value);
+
+ return out;
+
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient linearCombination(final SparseGradient a1, final SparseGradient b1,
+ final SparseGradient a2, final SparseGradient b2,
+ final SparseGradient a3, final SparseGradient b3,
+ final SparseGradient a4, final SparseGradient b4) {
+
+ // compute a simple value, with all partial derivatives
+ SparseGradient out = a1.multiply(b1).add(a2.multiply(b2)).add(a3.multiply(b3)).add(a4.multiply(b4));
+
+ // recompute an accurate value, taking care of cancellations
+ out.value = MathArrays.linearCombination(a1.value, b1.value,
+ a2.value, b2.value,
+ a3.value, b3.value,
+ a4.value, b4.value);
+
+ return out;
+
+ }
+
+ /** {@inheritDoc} */
+ @Override
+ public SparseGradient linearCombination(final double a1, final SparseGradient b1,
+ final double a2, final SparseGradient b2,
+ final double a3, final SparseGradient b3,
+ final double a4, final SparseGradient b4) {
+
+ // compute a simple value, with all partial derivatives
+ SparseGradient out = b1.multiply(a1).add(b2.multiply(a2)).add(b3.multiply(a3)).add(b4.multiply(a4));
+
+ // recompute an accurate value, taking care of cancellations
+ out.value = MathArrays.linearCombination(a1, b1.value,
+ a2, b2.value,
+ a3, b3.value,
+ a4, b4.value);
+
+ return out;
+
+ }
+
+ /**
+ * Test for the equality of two sparse gradients.
+ * <p>
+ * Sparse gradients are considered equal if they have the same value
+ * and the same derivatives.
+ * </p>
+ * @param other Object to test for equality to this
+ * @return true if two sparse gradients are equal
+ */
+ @Override
+ public boolean equals(Object other) {
+
+ if (this == other) {
+ return true;
+ }
+
+ if (other instanceof SparseGradient) {
+ final SparseGradient rhs = (SparseGradient)other;
+ if (!Precision.equals(value, rhs.value, 1)) {
+ return false;
+ }
+ if (derivatives.size() != rhs.derivatives.size()) {
+ return false;
+ }
+ for (final Map.Entry<Integer, Double> entry : derivatives.entrySet()) {
+ if (!rhs.derivatives.containsKey(entry.getKey())) {
+ return false;
+ }
+ if (!Precision.equals(entry.getValue(), rhs.derivatives.get(entry.getKey()), 1)) {
+ return false;
+ }
+ }
+ return true;
+ }
+
+ return false;
+
+ }
+
+ /**
+ * Get a hashCode for the sparse gradient.
+ * @return a hash code value for this object
+ * @since 3.3
+ */
+ @Override
+ public int hashCode() {
+ return 743 + 809 * MathUtils.hash(value) + 167 * derivatives.hashCode();
+ }
+
+}
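The taylor method above evaluates a first-order Taylor expansion from the stored partials: f(x+Δx, y+Δy, ...) ≈ f + Σ Δᵢ · ∂f/∂xᵢ. A stdlib-only sketch of that evaluation (the helper and its inputs are hypothetical, standing in for a SparseGradient's value and derivatives):

```java
/**
 * Stdlib-only sketch of first-order Taylor evaluation as performed by
 * SparseGradient.taylor(double...). Illustrative names, not the real API.
 */
public class TaylorSketch {

    // f(x + dx, y + dy, ...) ≈ f + fx*dx + fy*dy + ...
    static double taylor(double f, double[] grad, double... delta) {
        double y = f;
        for (int i = 0; i < delta.length; i++) {
            y += delta[i] * grad[i];
        }
        return y;
    }

    public static void main(String[] args) {
        // f(x, y) = x * y at (3, 5): f = 15, gradient = (5, 3)
        double approx = taylor(15.0, new double[] {5.0, 3.0}, 0.1, -0.2);
        System.out.println(approx);      // ≈ 14.9 (first-order estimate)
        System.out.println(3.1 * 4.8);   // ≈ 14.88 (exact value for comparison)
    }
}
```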
Propchange: commons/proper/math/trunk/src/main/java/org/apache/commons/math3/analysis/differentiation/SparseGradient.java
------------------------------------------------------------------------------
svn:eol-style = native
Propchange: commons/proper/math/trunk/src/main/java/org/apache/commons/math3/analysis/differentiation/SparseGradient.java
------------------------------------------------------------------------------
svn:keywords = "Author Date Id Revision"
Added: commons/proper/math/trunk/src/test/java/org/apache/commons/math3/analysis/differentiation/SparseGradientTest.java
URL: http://svn.apache.org/viewvc/commons/proper/math/trunk/src/test/java/org/apache/commons/math3/analysis/differentiation/SparseGradientTest.java?rev=1536073&view=auto
==============================================================================
--- commons/proper/math/trunk/src/test/java/org/apache/commons/math3/analysis/differentiation/SparseGradientTest.java (added)
+++ commons/proper/math/trunk/src/test/java/org/apache/commons/math3/analysis/differentiation/SparseGradientTest.java Sun Oct 27 09:52:44 2013
@@ -0,0 +1,1128 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements. See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+package org.apache.commons.math3.analysis.differentiation;
+
+import java.util.Arrays;
+import java.util.List;
+
+import org.apache.commons.math3.ExtendedFieldElementAbstractTest;
+import org.apache.commons.math3.TestUtils;
+import org.apache.commons.math3.analysis.polynomials.PolynomialFunction;
+import org.apache.commons.math3.random.Well1024a;
+import org.apache.commons.math3.util.FastMath;
+import org.junit.Assert;
+import org.junit.Test;
+
+public class SparseGradientTest extends ExtendedFieldElementAbstractTest<SparseGradient> {
+
+ @Override
+ protected SparseGradient build(final double x) {
+ return SparseGradient.createVariable(0, x);
+ }
+
+ @Test
+ public void testConstant() {
+ double c = 1.0;
+ SparseGradient grad = SparseGradient.createConstant(c);
+ Assert.assertEquals(c, grad.getValue(), 1.0e-15); // returns the value
+ Assert.assertEquals(0, grad.numVars()); // has no variables
+ }
+
+ @Test
+ public void testVariable() {
+ double v = 1.0;
+ int id = 0;
+ SparseGradient grad = SparseGradient.createVariable(id, v);
+ Assert.assertEquals(v, grad.getValue(), 1.0e-15); // returns the value
+ Assert.assertEquals(1, grad.numVars()); // has one variable
+ Assert.assertEquals(1.0, grad.getDerivative(id), 1.0e-15); // derivative w.r.t. itself is 1
+ }
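The two factory methods exercised above are the whole entry point of the class: a constant carries no partial derivatives, while a variable carries a single partial of 1 with respect to its own id. The sketch below is a minimal, self-contained illustration of that sparse representation (a value plus a map from variable id to partial derivative) using only the JDK; it is an assumption-laden toy, not the commons-math3 implementation, and the class and method names (`SparseGradSketch`, `variable`, `constant`) are hypothetical.

```java
import java.util.HashMap;
import java.util.Map;

// Toy sparse gradient: value + {variable id -> partial derivative}.
// Only variables that actually appear in a computation occupy storage,
// which is why the approach stays cheap when the total number of
// variables is huge but each expression touches only a few of them.
public class SparseGradSketch {
    final double value;
    final Map<Integer, Double> partials = new HashMap<>();

    SparseGradSketch(double value) { this.value = value; }

    static SparseGradSketch constant(double c) {
        return new SparseGradSketch(c); // no partials: derivative 0 everywhere
    }

    static SparseGradSketch variable(int id, double v) {
        SparseGradSketch g = new SparseGradSketch(v);
        g.partials.put(id, 1.0); // d(x_id)/d(x_id) = 1
        return g;
    }

    SparseGradSketch add(SparseGradSketch o) {
        SparseGradSketch r = new SparseGradSketch(value + o.value);
        r.partials.putAll(partials);
        o.partials.forEach((id, d) -> r.partials.merge(id, d, Double::sum));
        return r;
    }

    SparseGradSketch multiply(SparseGradSketch o) {
        // product rule per variable id: d(uv) = u dv + v du
        SparseGradSketch r = new SparseGradSketch(value * o.value);
        partials.forEach((id, d) -> r.partials.merge(id, d * o.value, Double::sum));
        o.partials.forEach((id, d) -> r.partials.merge(id, d * value, Double::sum));
        return r;
    }

    double derivative(int id) { return partials.getOrDefault(id, 0.0); }

    public static void main(String[] args) {
        // f = x * y + 2 at x = 3 (id 0), y = 5 (id 1)
        SparseGradSketch x = variable(0, 3.0);
        SparseGradSketch y = variable(1, 5.0);
        SparseGradSketch f = x.multiply(y).add(constant(2.0));
        System.out.println(f.value);          // 17.0
        System.out.println(f.derivative(0));  // df/dx = y = 5.0
        System.out.println(f.derivative(1));  // df/dy = x = 3.0
    }
}
```

Note how `f.derivative(2)` on this sketch would return 0.0 without ever allocating anything for variable 2; a dense gradient array would pay for every variable regardless.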
+
+ @Test
+ public void testVarAddition() {
+ final double v1 = 1.0;
+ final double v2 = 2.0;
+ final int id1 = -1;
+ final int id2 = 3;
+ final SparseGradient var1 = SparseGradient.createVariable(id1, v1);
+ final SparseGradient var2 = SparseGradient.createVariable(id2, v2);
+ final SparseGradient sum = var1.add(var2);
+
+ Assert.assertEquals(v1 + v2, sum.getValue(), 1.0e-15); // returns the value
+ Assert.assertEquals(2, sum.numVars());
+ Assert.assertEquals(1.0, sum.getDerivative(id1), 1.0e-15);
+ Assert.assertEquals(1.0, sum.getDerivative(id2), 1.0e-15);
+ }
+
+ @Test
+ public void testSubtraction() {
+ final double v1 = 1.0;
+ final double v2 = 2.0;
+ final int id1 = -1;
+ final int id2 = 3;
+ final SparseGradient var1 = SparseGradient.createVariable(id1, v1);
+ final SparseGradient var2 = SparseGradient.createVariable(id2, v2);
+ final SparseGradient diff = var1.subtract(var2);
+
+ Assert.assertEquals(v1 - v2, diff.getValue(), 1.0e-15); // returns the value
+ Assert.assertEquals(2, diff.numVars());
+ Assert.assertEquals(1.0, diff.getDerivative(id1), 1.0e-15);
+ Assert.assertEquals(-1.0, diff.getDerivative(id2), 1.0e-15);
+ }
+
+ @Test
+ public void testDivision() {
+ final double v1 = 1.0;
+ final double v2 = 2.0;
+ final int id1 = -1;
+ final int id2 = 3;
+ final SparseGradient var1 = SparseGradient.createVariable(id1, v1);
+ final SparseGradient var2 = SparseGradient.createVariable(id2, v2);
+ final SparseGradient out = var1.divide(var2);
+ Assert.assertEquals(v1 / v2, out.getValue(), 1.0e-15); // returns the value
+ Assert.assertEquals(2, out.numVars());
+ Assert.assertEquals(1 / v2, out.getDerivative(id1), 1.0e-15);
+ Assert.assertEquals(-1 / (v2 * v2), out.getDerivative(id2), 1.0e-15);
+ }
+
+ @Test
+ public void testMult() {
+ final double v1 = 1.0;
+ final double c1 = 0.5;
+ final double v2 = 2.0;
+ final int id1 = -1;
+ final int id2 = 3;
+ final SparseGradient var1 = SparseGradient.createVariable(id1, v1);
+ final SparseGradient unit1 = var1.multiply(c1);
+ final SparseGradient unit2 = SparseGradient.createVariable(id2, v2).multiply(var1);
+ final SparseGradient sum = unit1.add(unit2);
+ Assert.assertEquals(v1 * c1 + v2 * v1, sum.getValue(), 1.0e-15); // returns the value
+ Assert.assertEquals(2, sum.numVars());
+ Assert.assertEquals(c1 + v2, sum.getDerivative(id1), 1.0e-15);
+ Assert.assertEquals(v1, sum.getDerivative(id2), 1.0e-15);
+ }
+
+ @Test
+ public void testVarMultInPlace() {
+ final double v1 = 1.0;
+ final double c1 = 0.5;
+ final double v2 = 2.0;
+ final int id1 = -1;
+ final int id2 = 3;
+ final SparseGradient var1 = SparseGradient.createVariable(id1, v1);
+ final SparseGradient sum = var1.multiply(c1);
+ final SparseGradient mult = SparseGradient.createVariable(id2, v2);
+ mult.multInPlace(var1);
+ sum.addInPlace(mult);
+ Assert.assertEquals(v1 * c1 + v2 * v1, sum.getValue(), 1.0e-15); // returns the value
+ Assert.assertEquals(2, sum.numVars());
+ Assert.assertEquals(c1 + v2, sum.getDerivative(id1), 1.0e-15);
+ Assert.assertEquals(v1, sum.getDerivative(id2), 1.0e-15);
+ }
+
+ @Test
+ public void testPrimitiveAdd() {
+ checkF0F1(SparseGradient.createVariable(0, 1.0).add(5), 6.0, 1.0, 0.0, 0.0);
+ checkF0F1(SparseGradient.createVariable(1, 2.0).add(5), 7.0, 0.0, 1.0, 0.0);
+ checkF0F1(SparseGradient.createVariable(2, 3.0).add(5), 8.0, 0.0, 0.0, 1.0);
+ }
+
+ @Test
+ public void testAdd() {
+ SparseGradient x = SparseGradient.createVariable(0, 1.0);
+ SparseGradient y = SparseGradient.createVariable(1, 2.0);
+ SparseGradient z = SparseGradient.createVariable(2, 3.0);
+ SparseGradient xyz = x.add(y.add(z));
+ checkF0F1(xyz, x.getValue() + y.getValue() + z.getValue(), 1.0, 1.0, 1.0);
+ }
+
+ @Test
+ public void testPrimitiveSubtract() {
+ checkF0F1(SparseGradient.createVariable(0, 1.0).subtract(5), -4.0, 1.0, 0.0, 0.0);
+ checkF0F1(SparseGradient.createVariable(1, 2.0).subtract(5), -3.0, 0.0, 1.0, 0.0);
+ checkF0F1(SparseGradient.createVariable(2, 3.0).subtract(5), -2.0, 0.0, 0.0, 1.0);
+ }
+
+ @Test
+ public void testSubtract() {
+ SparseGradient x = SparseGradient.createVariable(0, 1.0);
+ SparseGradient y = SparseGradient.createVariable(1, 2.0);
+ SparseGradient z = SparseGradient.createVariable(2, 3.0);
+ SparseGradient xyz = x.subtract(y.subtract(z));
+ checkF0F1(xyz, x.getValue() - (y.getValue() - z.getValue()), 1.0, -1.0, 1.0);
+ }
+
+ @Test
+ public void testPrimitiveMultiply() {
+ checkF0F1(SparseGradient.createVariable(0, 1.0).multiply(5), 5.0, 5.0, 0.0, 0.0);
+ checkF0F1(SparseGradient.createVariable(1, 2.0).multiply(5), 10.0, 0.0, 5.0, 0.0);
+ checkF0F1(SparseGradient.createVariable(2, 3.0).multiply(5), 15.0, 0.0, 0.0, 5.0);
+ }
+
+ @Test
+ public void testMultiply() {
+ SparseGradient x = SparseGradient.createVariable(0, 1.0);
+ SparseGradient y = SparseGradient.createVariable(1, 2.0);
+ SparseGradient z = SparseGradient.createVariable(2, 3.0);
+ SparseGradient xyz = x.multiply(y.multiply(z));
+ checkF0F1(xyz, 6.0, 6.0, 3.0, 2.0);
+ }
+
+ @Test
+ public void testNegate() {
+ checkF0F1(SparseGradient.createVariable(0, 1.0).negate(), -1.0, -1.0, 0.0, 0.0);
+ checkF0F1(SparseGradient.createVariable(1, 2.0).negate(), -2.0, 0.0, -1.0, 0.0);
+ checkF0F1(SparseGradient.createVariable(2, 3.0).negate(), -3.0, 0.0, 0.0, -1.0);
+ }
+
+ @Test
+ public void testReciprocal() {
+ for (double x = 0.1; x < 1.2; x += 0.1) {
+ SparseGradient r = SparseGradient.createVariable(0, x).reciprocal();
+ Assert.assertEquals(1 / x, r.getValue(), 1.0e-15);
+ final double expected = -1 / (x * x);
+ Assert.assertEquals(expected, r.getDerivative(0), 1.0e-15 * FastMath.abs(expected));
+ }
+ }
+
+ @Test
+ public void testPow() {
+ for (int n = 0; n < 10; ++n) {
+
+ SparseGradient x = SparseGradient.createVariable(0, 1.0);
+ SparseGradient y = SparseGradient.createVariable(1, 2.0);
+ SparseGradient z = SparseGradient.createVariable(2, 3.0);
+ List<SparseGradient> list = Arrays.asList(x, y, z,
+ x.add(y).add(z),
+ x.multiply(y).multiply(z));
+
+ if (n == 0) {
+ for (SparseGradient sg : list) {
+ Assert.assertEquals(sg.getField().getOne(), sg.pow(n));
+ }
+ } else if (n == 1) {
+ for (SparseGradient sg : list) {
+ Assert.assertEquals(sg, sg.pow(n));
+ }
+ } else {
+ for (SparseGradient sg : list) {
+ SparseGradient p = sg.getField().getOne();
+ for (int i = 0; i < n; ++i) {
+ p = p.multiply(sg);
+ }
+ Assert.assertEquals(p, sg.pow(n));
+ }
+ }
+ }
+ }
+
+ @Test
+ public void testPowDoubleDS() {
+ for (int maxOrder = 1; maxOrder < 5; ++maxOrder) {
+
+ SparseGradient x = SparseGradient.createVariable(0, 0.1);
+ SparseGradient y = SparseGradient.createVariable(1, 0.2);
+ SparseGradient z = SparseGradient.createVariable(2, 0.3);
+ List<SparseGradient> list = Arrays.asList(x, y, z,
+ x.add(y).add(z),
+ x.multiply(y).multiply(z));
+
+ for (SparseGradient sg : list) {
+ // the special case a = 0 is included here
+ for (double a : new double[] { 0.0, 0.1, 1.0, 2.0, 5.0 }) {
+ SparseGradient reference = (a == 0) ?
+ x.getField().getZero() :
+ SparseGradient.createConstant(a).pow(sg);
+ SparseGradient result = SparseGradient.pow(a, sg);
+ Assert.assertEquals(reference, result);
+ }
+
+ }
+
+ // negative base: (-2)^x is defined for integer x only, so the value is sometimes OK, but the derivatives are always NaN
+ SparseGradient negEvenInteger = SparseGradient.pow(-2.0, SparseGradient.createVariable(0, 2.0));
+ Assert.assertEquals(4.0, negEvenInteger.getValue(), 1.0e-15);
+ Assert.assertTrue(Double.isNaN(negEvenInteger.getDerivative(0)));
+ SparseGradient negOddInteger = SparseGradient.pow(-2.0, SparseGradient.createVariable(0, 3.0));
+ Assert.assertEquals(-8.0, negOddInteger.getValue(), 1.0e-15);
+ Assert.assertTrue(Double.isNaN(negOddInteger.getDerivative(0)));
+ SparseGradient negNonInteger = SparseGradient.pow(-2.0, SparseGradient.createVariable(0, 2.001));
+ Assert.assertTrue(Double.isNaN(negNonInteger.getValue()));
+ Assert.assertTrue(Double.isNaN(negNonInteger.getDerivative(0)));
+
+ SparseGradient zeroNeg = SparseGradient.pow(0.0, SparseGradient.createVariable(0, -1.0));
+ Assert.assertTrue(Double.isNaN(zeroNeg.getValue()));
+ Assert.assertTrue(Double.isNaN(zeroNeg.getDerivative(0)));
+ SparseGradient posNeg = SparseGradient.pow(2.0, SparseGradient.createVariable(0, -2.0));
+ Assert.assertEquals(1.0 / 4.0, posNeg.getValue(), 1.0e-15);
+ Assert.assertEquals(FastMath.log(2.0) / 4.0, posNeg.getDerivative(0), 1.0e-15);
+
+ // very special case: a = 0 and power = 0
+ SparseGradient zeroZero = SparseGradient.pow(0.0, SparseGradient.createVariable(0, 0.0));
+
+ // this should be OK for simple first derivative with one variable only ...
+ Assert.assertEquals(1.0, zeroZero.getValue(), 1.0e-15);
+ Assert.assertEquals(Double.NEGATIVE_INFINITY, zeroZero.getDerivative(0), 1.0e-15);
+ Assert.assertEquals(0.0, zeroZero.getDerivative(1), 1.0e-15);
+ Assert.assertEquals(0.0, zeroZero.getDerivative(2), 1.0e-15);
+
+ }
+
+ }
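The `posNeg` assertion above relies on the identity d/dx (a^x) = a^x ln(a) for a > 0, evaluated at a = 2, x = -2, giving ln(2)/4. A quick JDK-only central-difference check (hypothetical class name `PowDerivativeCheck`, step size `h` chosen by assumption) confirms that formula:

```java
// Verify d/dx (a^x) = a^x * ln(a) at a = 2, x = -2 against a
// second-order central difference of Math.pow.
public class PowDerivativeCheck {
    public static void main(String[] args) {
        double a = 2.0, x = -2.0, h = 1.0e-6;
        double analytic = Math.pow(a, x) * Math.log(a);           // = ln(2)/4
        double numeric  = (Math.pow(a, x + h) - Math.pow(a, x - h)) / (2 * h);
        System.out.println(Math.abs(analytic - Math.log(2.0) / 4.0)); // ~0
        System.out.println(Math.abs(analytic - numeric));             // tiny (O(h^2))
    }
}
```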
+
+ @Test
+ public void testExpression() {
+ double epsilon = 2.5e-13;
+ for (double x = 0; x < 2; x += 0.2) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ for (double y = 0; y < 2; y += 0.2) {
+ SparseGradient sgY = SparseGradient.createVariable(1, y);
+ for (double z = 0; z > -2; z -= 0.2) {
+ SparseGradient sgZ = SparseGradient.createVariable(2, z);
+
+ // f(x, y, z) = x + 5 x y - 2 z + (8 z x - y)^3
+ SparseGradient sg =
+ sgZ.linearCombination(1, sgX,
+ 5, sgX.multiply(sgY),
+ -2, sgZ,
+ 1, sgZ.linearCombination(8, sgZ.multiply(sgX), -1, sgY).pow(3));
+ double f = x + 5 * x * y - 2 * z + FastMath.pow(8 * z * x - y, 3);
+ Assert.assertEquals(f, sg.getValue(), FastMath.abs(epsilon * f));
+
+ // df/dx = 1 + 5 y + 24 (8 z x - y)^2 z
+ double dfdx = 1 + 5 * y + 24 * z * FastMath.pow(8 * z * x - y, 2);
+ Assert.assertEquals(dfdx, sg.getDerivative(0), FastMath.abs(epsilon * dfdx));
+
+ }
+
+ }
+ }
+ }
+
+ @Test
+ public void testCompositionOneVariableX() {
+ double epsilon = 1.0e-13;
+ for (double x = 0.1; x < 1.2; x += 0.1) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ for (double y = 0.1; y < 1.2; y += 0.1) {
+ SparseGradient sgY = SparseGradient.createConstant(y);
+ SparseGradient f = sgX.divide(sgY).sqrt();
+ double f0 = FastMath.sqrt(x / y);
+ Assert.assertEquals(f0, f.getValue(), FastMath.abs(epsilon * f0));
+ double f1 = 1 / (2 * FastMath.sqrt(x * y));
+ Assert.assertEquals(f1, f.getDerivative(0), FastMath.abs(epsilon * f1));
+ }
+ }
+ }
+
+ @Test
+ public void testTrigo() {
+ double epsilon = 2.0e-12;
+ for (double x = 0.1; x < 1.2; x += 0.1) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ for (double y = 0.1; y < 1.2; y += 0.1) {
+ SparseGradient sgY = SparseGradient.createVariable(1, y);
+ for (double z = 0.1; z < 1.2; z += 0.1) {
+ SparseGradient sgZ = SparseGradient.createVariable(2, z);
+ SparseGradient f = sgX.divide(sgY.cos().add(sgZ.tan())).sin();
+ double a = FastMath.cos(y) + FastMath.tan(z);
+ double f0 = FastMath.sin(x / a);
+ Assert.assertEquals(f0, f.getValue(), FastMath.abs(epsilon * f0));
+ double dfdx = FastMath.cos(x / a) / a;
+ Assert.assertEquals(dfdx, f.getDerivative(0), FastMath.abs(epsilon * dfdx));
+ double dfdy = x * FastMath.sin(y) * dfdx / a;
+ Assert.assertEquals(dfdy, f.getDerivative(1), FastMath.abs(epsilon * dfdy));
+ double cz = FastMath.cos(z);
+ double cz2 = cz * cz;
+ double dfdz = -x * dfdx / (a * cz2);
+ Assert.assertEquals(dfdz, f.getDerivative(2), FastMath.abs(epsilon * dfdz));
+ }
+ }
+ }
+ }
+
+ @Test
+ public void testSqrtDefinition() {
+ for (double x = 0.1; x < 1.2; x += 0.001) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ SparseGradient sqrt1 = sgX.pow(0.5);
+ SparseGradient sqrt2 = sgX.sqrt();
+ SparseGradient zero = sqrt1.subtract(sqrt2);
+ checkF0F1(zero, 0.0, 0.0);
+ }
+ }
+
+ @Test
+ public void testRootNSingularity() {
+ for (int n = 2; n < 10; ++n) {
+ SparseGradient sgZero = SparseGradient.createVariable(0, 0.0);
+ SparseGradient rootN = sgZero.rootN(n);
+ Assert.assertEquals(0.0, rootN.getValue(), 1.0e-5);
+ Assert.assertTrue(Double.isInfinite(rootN.getDerivative(0)));
+ Assert.assertTrue(rootN.getDerivative(0) > 0);
+ }
+
+ }
+
+ @Test
+ public void testSqrtPow2() {
+ for (double x = 0.1; x < 1.2; x += 0.001) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ SparseGradient rebuiltX = sgX.multiply(sgX).sqrt();
+ SparseGradient zero = rebuiltX.subtract(sgX);
+ checkF0F1(zero, 0.0, 0.0);
+ }
+ }
+
+ @Test
+ public void testCbrtDefinition() {
+ for (double x = 0.1; x < 1.2; x += 0.001) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ SparseGradient cbrt1 = sgX.pow(1.0 / 3.0);
+ SparseGradient cbrt2 = sgX.cbrt();
+ SparseGradient zero = cbrt1.subtract(cbrt2);
+ checkF0F1(zero, 0.0, 0.0);
+ }
+ }
+
+ @Test
+ public void testCbrtPow3() {
+ for (double x = 0.1; x < 1.2; x += 0.001) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ SparseGradient rebuiltX = sgX.multiply(sgX.multiply(sgX)).cbrt();
+ SparseGradient zero = rebuiltX.subtract(sgX);
+ checkF0F1(zero, 0.0, 0.0);
+ }
+ }
+
+ @Test
+ public void testPowReciprocalPow() {
+ for (double x = 0.1; x < 1.2; x += 0.01) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ for (double y = 0.1; y < 1.2; y += 0.01) {
+ SparseGradient sgY = SparseGradient.createVariable(1, y);
+ SparseGradient rebuiltX = sgX.pow(sgY).pow(sgY.reciprocal());
+ SparseGradient zero = rebuiltX.subtract(sgX);
+ checkF0F1(zero, 0.0, 0.0, 0.0);
+ }
+ }
+ }
+
+ @Test
+ public void testHypotDefinition() {
+ for (double x = -1.7; x < 2; x += 0.2) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ for (double y = -1.7; y < 2; y += 0.2) {
+ SparseGradient sgY = SparseGradient.createVariable(1, y);
+ SparseGradient hypot = SparseGradient.hypot(sgY, sgX);
+ SparseGradient ref = sgX.multiply(sgX).add(sgY.multiply(sgY)).sqrt();
+ SparseGradient zero = hypot.subtract(ref);
+ checkF0F1(zero, 0.0, 0.0, 0.0);
+
+ }
+ }
+ }
+
+ @Test
+ public void testHypotNoOverflow() {
+
+ SparseGradient sgX = SparseGradient.createVariable(0, +3.0e250);
+ SparseGradient sgY = SparseGradient.createVariable(1, -4.0e250);
+ SparseGradient hypot = SparseGradient.hypot(sgX, sgY);
+ Assert.assertEquals(5.0e250, hypot.getValue(), 1.0e235);
+ Assert.assertEquals(sgX.getValue() / hypot.getValue(), hypot.getDerivative(0), 1.0e-10);
+ Assert.assertEquals(sgY.getValue() / hypot.getValue(), hypot.getDerivative(1), 1.0e-10);
+
+ SparseGradient sqrt = sgX.multiply(sgX).add(sgY.multiply(sgY)).sqrt();
+ Assert.assertTrue(Double.isInfinite(sqrt.getValue()));
+
+ }
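The overflow the test above guards against is easy to reproduce with plain doubles: squaring 3.0e250 exceeds Double.MAX_VALUE (about 1.8e308), so the naive sqrt(x*x + y*y) collapses to infinity, while a rescaled hypot recovers the 3-4-5 answer. A JDK-only demonstration (hypothetical class name `HypotOverflowDemo`):

```java
// Why hypot needs a dedicated implementation: the intermediate squares
// overflow even though the final result 5.0e250 is well within range.
public class HypotOverflowDemo {
    public static void main(String[] args) {
        double x = 3.0e250, y = -4.0e250;
        double naive = Math.sqrt(x * x + y * y); // x*x overflows to Infinity
        double safe  = Math.hypot(x, y);         // rescales internally
        System.out.println(Double.isInfinite(naive)); // true
        System.out.println(safe);                     // ~5.0e250
    }
}
```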
+
+ @Test
+ public void testHypotNegligible() {
+
+ SparseGradient sgSmall = SparseGradient.createVariable(0, +3.0e-10);
+ SparseGradient sgLarge = SparseGradient.createVariable(1, -4.0e25);
+
+ Assert.assertEquals(sgLarge.abs().getValue(),
+ SparseGradient.hypot(sgSmall, sgLarge).getValue(),
+ 1.0e-10);
+ Assert.assertEquals(0,
+ SparseGradient.hypot(sgSmall, sgLarge).getDerivative(0),
+ 1.0e-10);
+ Assert.assertEquals(-1,
+ SparseGradient.hypot(sgSmall, sgLarge).getDerivative(1),
+ 1.0e-10);
+
+ Assert.assertEquals(sgLarge.abs().getValue(),
+ SparseGradient.hypot(sgLarge, sgSmall).getValue(),
+ 1.0e-10);
+ Assert.assertEquals(0,
+ SparseGradient.hypot(sgLarge, sgSmall).getDerivative(0),
+ 1.0e-10);
+ Assert.assertEquals(-1,
+ SparseGradient.hypot(sgLarge, sgSmall).getDerivative(1),
+ 1.0e-10);
+
+ }
+
+ @Test
+ public void testHypotSpecial() {
+ Assert.assertTrue(Double.isNaN(SparseGradient.hypot(SparseGradient.createVariable(0, Double.NaN),
+ SparseGradient.createVariable(0, +3.0e250)).getValue()));
+ Assert.assertTrue(Double.isNaN(SparseGradient.hypot(SparseGradient.createVariable(0, +3.0e250),
+ SparseGradient.createVariable(0, Double.NaN)).getValue()));
+ Assert.assertTrue(Double.isInfinite(SparseGradient.hypot(SparseGradient.createVariable(0, Double.POSITIVE_INFINITY),
+ SparseGradient.createVariable(0, +3.0e250)).getValue()));
+ Assert.assertTrue(Double.isInfinite(SparseGradient.hypot(SparseGradient.createVariable(0, +3.0e250),
+ SparseGradient.createVariable(0, Double.POSITIVE_INFINITY)).getValue()));
+ }
+
+ @Test
+ public void testPrimitiveRemainder() {
+ for (double x = -1.7; x < 2; x += 0.2) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ for (double y = -1.7; y < 2; y += 0.2) {
+ SparseGradient remainder = sgX.remainder(y);
+ SparseGradient ref = sgX.subtract(x - FastMath.IEEEremainder(x, y));
+ SparseGradient zero = remainder.subtract(ref);
+ checkF0F1(zero, 0.0, 0.0, 0.0);
+ }
+ }
+ }
+
+ @Test
+ public void testRemainder() {
+ for (double x = -1.7; x < 2; x += 0.2) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ for (double y = -1.7; y < 2; y += 0.2) {
+ SparseGradient sgY = SparseGradient.createVariable(1, y);
+ SparseGradient remainder = sgX.remainder(sgY);
+ SparseGradient ref = sgX.subtract(sgY.multiply((x - FastMath.IEEEremainder(x, y)) / y));
+ SparseGradient zero = remainder.subtract(ref);
+ checkF0F1(zero, 0.0, 0.0, 0.0);
+ }
+ }
+ }
+
+ @Override
+ @Test
+ public void testExp() {
+ for (double x = 0.1; x < 1.2; x += 0.001) {
+ double refExp = FastMath.exp(x);
+ checkF0F1(SparseGradient.createVariable(0, x).exp(), refExp, refExp);
+ }
+ }
+
+ @Test
+ public void testExpm1Definition() {
+ for (double x = 0.1; x < 1.2; x += 0.001) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ SparseGradient expm11 = sgX.expm1();
+ SparseGradient expm12 = sgX.exp().subtract(sgX.getField().getOne());
+ SparseGradient zero = expm11.subtract(expm12);
+ checkF0F1(zero, 0.0, 0.0);
+ }
+ }
+
+ @Override
+ @Test
+ public void testLog() {
+ for (double x = 0.1; x < 1.2; x += 0.001) {
+ checkF0F1(SparseGradient.createVariable(0, x).log(), FastMath.log(x), 1.0 / x);
+ }
+ }
+
+ @Test
+ public void testLog1pDefinition() {
+ for (double x = 0.1; x < 1.2; x += 0.001) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ SparseGradient log1p1 = sgX.log1p();
+ SparseGradient log1p2 = sgX.add(sgX.getField().getOne()).log();
+ SparseGradient zero = log1p1.subtract(log1p2);
+ checkF0F1(zero, 0.0, 0.0);
+ }
+ }
+
+ @Test
+ public void testLog10Definition() {
+ for (double x = 0.1; x < 1.2; x += 0.001) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ SparseGradient log101 = sgX.log10();
+ SparseGradient log102 = sgX.log().divide(FastMath.log(10.0));
+ SparseGradient zero = log101.subtract(log102);
+ checkF0F1(zero, 0.0, 0.0);
+ }
+ }
+
+ @Test
+ public void testLogExp() {
+ for (double x = 0.1; x < 1.2; x += 0.001) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ SparseGradient rebuiltX = sgX.exp().log();
+ SparseGradient zero = rebuiltX.subtract(sgX);
+ checkF0F1(zero, 0.0, 0.0);
+ }
+ }
+
+ @Test
+ public void testLog1pExpm1() {
+ for (double x = 0.1; x < 1.2; x += 0.001) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ SparseGradient rebuiltX = sgX.expm1().log1p();
+ SparseGradient zero = rebuiltX.subtract(sgX);
+ checkF0F1(zero, 0.0, 0.0);
+ }
+ }
+
+ @Test
+ public void testLog10Power() {
+ for (double x = 0.1; x < 1.2; x += 0.001) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ SparseGradient rebuiltX = SparseGradient.pow(10.0, sgX).log10();
+ SparseGradient zero = rebuiltX.subtract(sgX);
+ checkF0F1(zero, 0.0, 0.0);
+ }
+ }
+
+ @Test
+ public void testSinCos() {
+ for (double x = 0.1; x < 1.2; x += 0.001) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ SparseGradient sin = sgX.sin();
+ SparseGradient cos = sgX.cos();
+ double s = FastMath.sin(x);
+ double c = FastMath.cos(x);
+ checkF0F1(sin, s, c);
+ checkF0F1(cos, c, -s);
+ }
+ }
+
+ @Test
+ public void testSinAsin() {
+ for (double x = 0.1; x < 1.2; x += 0.001) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ SparseGradient rebuiltX = sgX.sin().asin();
+ SparseGradient zero = rebuiltX.subtract(sgX);
+ checkF0F1(zero, 0.0, 0.0);
+ }
+ }
+
+ @Test
+ public void testCosAcos() {
+ for (double x = 0.1; x < 1.2; x += 0.001) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ SparseGradient rebuiltX = sgX.cos().acos();
+ SparseGradient zero = rebuiltX.subtract(sgX);
+ checkF0F1(zero, 0.0, 0.0);
+ }
+ }
+
+ @Test
+ public void testTanAtan() {
+ for (double x = 0.1; x < 1.2; x += 0.001) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ SparseGradient rebuiltX = sgX.tan().atan();
+ SparseGradient zero = rebuiltX.subtract(sgX);
+ checkF0F1(zero, 0.0, 0.0);
+ }
+ }
+
+ @Test
+ public void testTangentDefinition() {
+ for (double x = 0.1; x < 1.2; x += 0.001) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ SparseGradient tan1 = sgX.sin().divide(sgX.cos());
+ SparseGradient tan2 = sgX.tan();
+ SparseGradient zero = tan1.subtract(tan2);
+ checkF0F1(zero, 0.0, 0.0);
+ }
+ }
+
+ @Override
+ @Test
+ public void testAtan2() {
+ for (double x = -1.7; x < 2; x += 0.2) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ for (double y = -1.7; y < 2; y += 0.2) {
+ SparseGradient sgY = SparseGradient.createVariable(1, y);
+ SparseGradient atan2 = SparseGradient.atan2(sgY, sgX);
+ SparseGradient ref = sgY.divide(sgX).atan();
+ if (x < 0) {
+ ref = (y < 0) ? ref.subtract(FastMath.PI) : ref.add(FastMath.PI);
+ }
+ SparseGradient zero = atan2.subtract(ref);
+ checkF0F1(zero, 0.0, 0.0);
+ }
+ }
+ }
+
+ @Test
+ public void testAtan2SpecialCases() {
+
+ SparseGradient pp =
+ SparseGradient.atan2(SparseGradient.createVariable(1, +0.0),
+ SparseGradient.createVariable(1, +0.0));
+ Assert.assertEquals(0, pp.getValue(), 1.0e-15);
+ Assert.assertEquals(+1, FastMath.copySign(1, pp.getValue()), 1.0e-15);
+
+ SparseGradient pn =
+ SparseGradient.atan2(SparseGradient.createVariable(1, +0.0),
+ SparseGradient.createVariable(1, -0.0));
+ Assert.assertEquals(FastMath.PI, pn.getValue(), 1.0e-15);
+
+ SparseGradient np =
+ SparseGradient.atan2(SparseGradient.createVariable(1, -0.0),
+ SparseGradient.createVariable(1, +0.0));
+ Assert.assertEquals(0, np.getValue(), 1.0e-15);
+ Assert.assertEquals(-1, FastMath.copySign(1, np.getValue()), 1.0e-15);
+
+ SparseGradient nn =
+ SparseGradient.atan2(SparseGradient.createVariable(1, -0.0),
+ SparseGradient.createVariable(1, -0.0));
+ Assert.assertEquals(-FastMath.PI, nn.getValue(), 1.0e-15);
+
+ }
+
+ @Test
+ public void testSinhDefinition() {
+ for (double x = 0.1; x < 1.2; x += 0.001) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ SparseGradient sinh1 = sgX.exp().subtract(sgX.exp().reciprocal()).multiply(0.5);
+ SparseGradient sinh2 = sgX.sinh();
+ SparseGradient zero = sinh1.subtract(sinh2);
+ checkF0F1(zero, 0.0, 0.0);
+ }
+ }
+
+ @Test
+ public void testCoshDefinition() {
+ for (double x = 0.1; x < 1.2; x += 0.001) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ SparseGradient cosh1 = sgX.exp().add(sgX.exp().reciprocal()).multiply(0.5);
+ SparseGradient cosh2 = sgX.cosh();
+ SparseGradient zero = cosh1.subtract(cosh2);
+ checkF0F1(zero, 0.0, 0.0);
+ }
+ }
+
+ @Test
+ public void testTanhDefinition() {
+ for (double x = 0.1; x < 1.2; x += 0.001) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ SparseGradient tanh1 = sgX.exp().subtract(sgX.exp().reciprocal()).divide(sgX.exp().add(sgX.exp().reciprocal()));
+ SparseGradient tanh2 = sgX.tanh();
+ SparseGradient zero = tanh1.subtract(tanh2);
+ checkF0F1(zero, 0.0, 0.0);
+ }
+ }
+
+ @Test
+ public void testSinhAsinh() {
+ for (double x = 0.1; x < 1.2; x += 0.001) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ SparseGradient rebuiltX = sgX.sinh().asinh();
+ SparseGradient zero = rebuiltX.subtract(sgX);
+ checkF0F1(zero, 0.0, 0.0);
+ }
+ }
+
+ @Test
+ public void testCoshAcosh() {
+ for (double x = 0.1; x < 1.2; x += 0.001) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ SparseGradient rebuiltX = sgX.cosh().acosh();
+ SparseGradient zero = rebuiltX.subtract(sgX);
+ checkF0F1(zero, 0.0, 0.0);
+ }
+ }
+
+ @Test
+ public void testTanhAtanh() {
+ for (double x = 0.1; x < 1.2; x += 0.001) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ SparseGradient rebuiltX = sgX.tanh().atanh();
+ SparseGradient zero = rebuiltX.subtract(sgX);
+ checkF0F1(zero, 0.0, 0.0);
+ }
+ }
+
+ @Test
+ public void testCompositionOneVariableY() {
+ for (double x = 0.1; x < 1.2; x += 0.1) {
+ SparseGradient sgX = SparseGradient.createConstant(x);
+ for (double y = 0.1; y < 1.2; y += 0.1) {
+ SparseGradient sgY = SparseGradient.createVariable(0, y);
+ SparseGradient f = sgX.divide(sgY).sqrt();
+ double f0 = FastMath.sqrt(x / y);
+ double f1 = -x / (2 * y * y * f0);
+ checkF0F1(f, f0, f1);
+ }
+ }
+ }
+
+ @Test
+ public void testTaylorPolynomial() {
+ for (double x = 0; x < 1.2; x += 0.1) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ for (double y = 0; y < 1.2; y += 0.2) {
+ SparseGradient sgY = SparseGradient.createVariable(1, y);
+ for (double z = 0; z < 1.2; z += 0.2) {
+ SparseGradient sgZ = SparseGradient.createVariable(2, z);
+ SparseGradient f = sgX.multiply(3).add(sgZ.multiply(-2)).add(sgY.multiply(5));
+ for (double dx = -0.2; dx < 0.2; dx += 0.2) {
+ for (double dy = -0.2; dy < 0.2; dy += 0.1) {
+ for (double dz = -0.2; dz < 0.2; dz += 0.1) {
+ double ref = 3 * (x + dx) + 5 * (y + dy) - 2 * (z + dz);
+ Assert.assertEquals(ref, f.taylor(dx, dy, dz), 3.0e-15);
+ }
+ }
+ }
+ }
+ }
+ }
+ }
+
+ @Test
+ public void testTaylorAtan2() {
+ double x0 = 0.1;
+ double y0 = -0.3;
+ SparseGradient sgX = SparseGradient.createVariable(0, x0);
+ SparseGradient sgY = SparseGradient.createVariable(1, y0);
+ SparseGradient atan2 = SparseGradient.atan2(sgY, sgX);
+ double maxError = 0;
+ for (double dx = -0.05; dx < 0.05; dx += 0.001) {
+ for (double dy = -0.05; dy < 0.05; dy += 0.001) {
+ double ref = FastMath.atan2(y0 + dy, x0 + dx);
+ maxError = FastMath.max(maxError, FastMath.abs(ref - atan2.taylor(dx, dy)));
+ }
+ }
+ double expectedError = 0.0241;
+ Assert.assertEquals(expectedError, maxError, 0.01 * expectedError);
+ }
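Since the class stores first derivatives only, the `taylor(dx, dy)` call above is a first-order expansion: f(x0+dx, y0+dy) ≈ f0 + (∂f/∂x) dx + (∂f/∂y) dy, and the residual maxError is the second-order remainder. For atan2 the analytic partials are ∂/∂x = -y/(x²+y²) and ∂/∂y = x/(x²+y²). A JDK-only sketch (hypothetical class name `TaylorAtan2Sketch`) reproducing the same scan with those closed-form partials:

```java
// First-order Taylor expansion of atan2 around (x0, y0) = (0.1, -0.3),
// using the analytic partials instead of automatic differentiation.
public class TaylorAtan2Sketch {
    public static void main(String[] args) {
        double x0 = 0.1, y0 = -0.3;
        double r2 = x0 * x0 + y0 * y0;  // = 0.1
        double f0 = Math.atan2(y0, x0);
        double dfdx = -y0 / r2;         // ≈ 3.0
        double dfdy =  x0 / r2;         // ≈ 1.0
        double maxError = 0;
        for (double dx = -0.05; dx < 0.05; dx += 0.001) {
            for (double dy = -0.05; dy < 0.05; dy += 0.001) {
                double taylor = f0 + dfdx * dx + dfdy * dy;
                double ref = Math.atan2(y0 + dy, x0 + dx);
                maxError = Math.max(maxError, Math.abs(ref - taylor));
            }
        }
        System.out.println(maxError); // second-order remainder, a few hundredths
    }
}
```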
+
+ @Override
+ @Test
+ public void testAbs() {
+
+ SparseGradient minusOne = SparseGradient.createVariable(0, -1.0);
+ Assert.assertEquals(+1.0, minusOne.abs().getValue(), 1.0e-15);
+ Assert.assertEquals(-1.0, minusOne.abs().getDerivative(0), 1.0e-15);
+
+ SparseGradient plusOne = SparseGradient.createVariable(0, +1.0);
+ Assert.assertEquals(+1.0, plusOne.abs().getValue(), 1.0e-15);
+ Assert.assertEquals(+1.0, plusOne.abs().getDerivative(0), 1.0e-15);
+
+ SparseGradient minusZero = SparseGradient.createVariable(0, -0.0);
+ Assert.assertEquals(+0.0, minusZero.abs().getValue(), 1.0e-15);
+ Assert.assertEquals(-1.0, minusZero.abs().getDerivative(0), 1.0e-15);
+
+ SparseGradient plusZero = SparseGradient.createVariable(0, +0.0);
+ Assert.assertEquals(+0.0, plusZero.abs().getValue(), 1.0e-15);
+ Assert.assertEquals(+1.0, plusZero.abs().getDerivative(0), 1.0e-15);
+
+ }
+
+ @Override
+ @Test
+ public void testSignum() {
+
+ SparseGradient minusOne = SparseGradient.createVariable(0, -1.0);
+ Assert.assertEquals(-1.0, minusOne.signum().getValue(), 1.0e-15);
+ Assert.assertEquals( 0.0, minusOne.signum().getDerivative(0), 1.0e-15);
+
+ SparseGradient plusOne = SparseGradient.createVariable(0, +1.0);
+ Assert.assertEquals(+1.0, plusOne.signum().getValue(), 1.0e-15);
+ Assert.assertEquals( 0.0, plusOne.signum().getDerivative(0), 1.0e-15);
+
+ SparseGradient minusZero = SparseGradient.createVariable(0, -0.0);
+ Assert.assertEquals(-0.0, minusZero.signum().getValue(), 1.0e-15);
+ Assert.assertTrue(Double.doubleToLongBits(minusZero.signum().getValue()) < 0);
+ Assert.assertEquals( 0.0, minusZero.signum().getDerivative(0), 1.0e-15);
+
+ SparseGradient plusZero = SparseGradient.createVariable(0, +0.0);
+ Assert.assertEquals(+0.0, plusZero.signum().getValue(), 1.0e-15);
+ Assert.assertTrue(Double.doubleToLongBits(plusZero.signum().getValue()) == 0);
+ Assert.assertEquals( 0.0, plusZero.signum().getDerivative(0), 1.0e-15);
+
+ }
+
+ @Test
+ public void testCeilFloorRintLong() {
+
+ SparseGradient x = SparseGradient.createVariable(0, -1.5);
+ Assert.assertEquals(-1.5, x.getValue(), 1.0e-15);
+ Assert.assertEquals(+1.0, x.getDerivative(0), 1.0e-15);
+ Assert.assertEquals(-1.0, x.ceil().getValue(), 1.0e-15);
+ Assert.assertEquals(+0.0, x.ceil().getDerivative(0), 1.0e-15);
+ Assert.assertEquals(-2.0, x.floor().getValue(), 1.0e-15);
+ Assert.assertEquals(+0.0, x.floor().getDerivative(0), 1.0e-15);
+ Assert.assertEquals(-2.0, x.rint().getValue(), 1.0e-15);
+ Assert.assertEquals(+0.0, x.rint().getDerivative(0), 1.0e-15);
+ Assert.assertEquals(-2.0, x.subtract(x.getField().getOne()).rint().getValue(), 1.0e-15);
+ Assert.assertEquals(-1L, x.round());
+
+ }
+
+ @Test
+ public void testCopySign() {
+
+ SparseGradient minusOne = SparseGradient.createVariable(0, -1.0);
+ Assert.assertEquals(+1.0, minusOne.copySign(+1.0).getValue(), 1.0e-15);
+ Assert.assertEquals(-1.0, minusOne.copySign(+1.0).getDerivative(0), 1.0e-15);
+ Assert.assertEquals(-1.0, minusOne.copySign(-1.0).getValue(), 1.0e-15);
+ Assert.assertEquals(+1.0, minusOne.copySign(-1.0).getDerivative(0), 1.0e-15);
+ Assert.assertEquals(+1.0, minusOne.copySign(+0.0).getValue(), 1.0e-15);
+ Assert.assertEquals(-1.0, minusOne.copySign(+0.0).getDerivative(0), 1.0e-15);
+ Assert.assertEquals(-1.0, minusOne.copySign(-0.0).getValue(), 1.0e-15);
+ Assert.assertEquals(+1.0, minusOne.copySign(-0.0).getDerivative(0), 1.0e-15);
+ Assert.assertEquals(+1.0, minusOne.copySign(Double.NaN).getValue(), 1.0e-15);
+ Assert.assertEquals(-1.0, minusOne.copySign(Double.NaN).getDerivative(0), 1.0e-15);
+
+ SparseGradient plusOne = SparseGradient.createVariable(0, +1.0);
+ Assert.assertEquals(+1.0, plusOne.copySign(+1.0).getValue(), 1.0e-15);
+ Assert.assertEquals(+1.0, plusOne.copySign(+1.0).getDerivative(0), 1.0e-15);
+ Assert.assertEquals(-1.0, plusOne.copySign(-1.0).getValue(), 1.0e-15);
+ Assert.assertEquals(-1.0, plusOne.copySign(-1.0).getDerivative(0), 1.0e-15);
+ Assert.assertEquals(+1.0, plusOne.copySign(+0.0).getValue(), 1.0e-15);
+ Assert.assertEquals(+1.0, plusOne.copySign(+0.0).getDerivative(0), 1.0e-15);
+ Assert.assertEquals(-1.0, plusOne.copySign(-0.0).getValue(), 1.0e-15);
+ Assert.assertEquals(-1.0, plusOne.copySign(-0.0).getDerivative(0), 1.0e-15);
+ Assert.assertEquals(+1.0, plusOne.copySign(Double.NaN).getValue(), 1.0e-15);
+ Assert.assertEquals(+1.0, plusOne.copySign(Double.NaN).getDerivative(0), 1.0e-15);
+
+ }
+
+ @Test
+ public void testToDegreesDefinition() {
+ double epsilon = 3.0e-16;
+ for (int maxOrder = 0; maxOrder < 6; ++maxOrder) {
+ for (double x = 0.1; x < 1.2; x += 0.001) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ Assert.assertEquals(FastMath.toDegrees(x), sgX.toDegrees().getValue(), epsilon);
+ Assert.assertEquals(180 / FastMath.PI, sgX.toDegrees().getDerivative(0), epsilon);
+ }
+ }
+ }
+
+ @Test
+ public void testToRadiansDefinition() {
+ double epsilon = 3.0e-16;
+ for (int maxOrder = 0; maxOrder < 6; ++maxOrder) {
+ for (double x = 0.1; x < 1.2; x += 0.001) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ Assert.assertEquals(FastMath.toRadians(x), sgX.toRadians().getValue(), epsilon);
+ Assert.assertEquals(FastMath.PI / 180, sgX.toRadians().getDerivative(0), epsilon);
+ }
+ }
+ }
+
+ @Test
+ public void testDegRad() {
+ for (double x = 0.1; x < 1.2; x += 0.001) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ SparseGradient rebuiltX = sgX.toDegrees().toRadians();
+ SparseGradient zero = rebuiltX.subtract(sgX);
+ checkF0F1(zero, 0, 0);
+ }
+ }
+
+ @Test
+ public void testCompose() {
+ PolynomialFunction poly =
+ new PolynomialFunction(new double[] { 1.0, 2.0, 3.0, 4.0, 5.0, 6.0 });
+ for (double x = 0.1; x < 1.2; x += 0.001) {
+ SparseGradient sgX = SparseGradient.createVariable(0, x);
+ SparseGradient sgY1 = sgX.getField().getZero();
+ for (int i = poly.degree(); i >= 0; --i) {
+ sgY1 = sgY1.multiply(sgX).add(poly.getCoefficients()[i]);
+ }
+ SparseGradient sgY2 = sgX.compose(poly.value(x), poly.derivative().value(x));
+ SparseGradient zero = sgY1.subtract(sgY2);
+ checkF0F1(zero, 0.0, 0.0);
+ }
+ }
+
+ @Test
+ public void testField() {
+ SparseGradient x = SparseGradient.createVariable(0, 1.0);
+ checkF0F1(x.getField().getZero(), 0.0, 0.0, 0.0, 0.0);
+ checkF0F1(x.getField().getOne(), 1.0, 0.0, 0.0, 0.0);
+ Assert.assertEquals(SparseGradient.class, x.getField().getRuntimeClass());
+ }
+
+ @Test
+ public void testLinearCombination1DSDS() {
+ final SparseGradient[] a = new SparseGradient[] {
+ SparseGradient.createVariable(0, -1321008684645961.0 / 268435456.0),
+ SparseGradient.createVariable(1, -5774608829631843.0 / 268435456.0),
+ SparseGradient.createVariable(2, -7645843051051357.0 / 8589934592.0)
+ };
+ final SparseGradient[] b = new SparseGradient[] {
+ SparseGradient.createVariable(3, -5712344449280879.0 / 2097152.0),
+ SparseGradient.createVariable(4, -4550117129121957.0 / 2097152.0),
+ SparseGradient.createVariable(5, 8846951984510141.0 / 131072.0)
+ };
+
+ final SparseGradient abSumInline = a[0].linearCombination(a[0], b[0], a[1], b[1], a[2], b[2]);
+ final SparseGradient abSumArray = a[0].linearCombination(a, b);
+
+ Assert.assertEquals(abSumInline.getValue(), abSumArray.getValue(), 1.0e-15);
+ Assert.assertEquals(-1.8551294182586248737720779899, abSumInline.getValue(), 1.0e-15);
+ Assert.assertEquals(b[0].getValue(), abSumInline.getDerivative(0), 1.0e-15);
+ Assert.assertEquals(b[1].getValue(), abSumInline.getDerivative(1), 1.0e-15);
+ Assert.assertEquals(b[2].getValue(), abSumInline.getDerivative(2), 1.0e-15);
+ Assert.assertEquals(a[0].getValue(), abSumInline.getDerivative(3), 1.0e-15);
+ Assert.assertEquals(a[1].getValue(), abSumInline.getDerivative(4), 1.0e-15);
+ Assert.assertEquals(a[2].getValue(), abSumInline.getDerivative(5), 1.0e-15);
+
+ }
+
+ @Test
+ public void testLinearCombination1DoubleDS() {
+ final double[] a = new double[] {
+ -1321008684645961.0 / 268435456.0,
+ -5774608829631843.0 / 268435456.0,
+ -7645843051051357.0 / 8589934592.0
+ };
+ final SparseGradient[] b = new SparseGradient[] {
+ SparseGradient.createVariable(0, -5712344449280879.0 / 2097152.0),
+ SparseGradient.createVariable(1, -4550117129121957.0 / 2097152.0),
+ SparseGradient.createVariable(2, 8846951984510141.0 / 131072.0)
+ };
+
+ final SparseGradient abSumInline = b[0].linearCombination(a[0], b[0],
+ a[1], b[1],
+ a[2], b[2]);
+ final SparseGradient abSumArray = b[0].linearCombination(a, b);
+
+ Assert.assertEquals(abSumInline.getValue(), abSumArray.getValue(), 1.0e-15);
+ Assert.assertEquals(-1.8551294182586248737720779899, abSumInline.getValue(), 1.0e-15);
+ Assert.assertEquals(a[0], abSumInline.getDerivative(0), 1.0e-15);
+ Assert.assertEquals(a[1], abSumInline.getDerivative(1), 1.0e-15);
+ Assert.assertEquals(a[2], abSumInline.getDerivative(2), 1.0e-15);
+
+ }
+
+ @Test
+ public void testLinearCombination2DSDS() {
+ // we compare accurate versus naive dot product implementations
+ // on regular vectors (i.e. not extreme cases like in the previous test)
+ Well1024a random = new Well1024a(0xc6af886975069f11l);
+
+ for (int i = 0; i < 10000; ++i) {
+ final SparseGradient[] u = new SparseGradient[4];
+ final SparseGradient[] v = new SparseGradient[4];
+ for (int j = 0; j < u.length; ++j) {
+ u[j] = SparseGradient.createVariable(j, 1e17 * random.nextDouble());
+ v[j] = SparseGradient.createConstant(1e17 * random.nextDouble());
+ }
+
+ SparseGradient lin = u[0].linearCombination(u[0], v[0], u[1], v[1]);
+ double ref = u[0].getValue() * v[0].getValue() +
+ u[1].getValue() * v[1].getValue();
+ Assert.assertEquals(ref, lin.getValue(), 1.0e-15 * FastMath.abs(ref));
+ Assert.assertEquals(v[0].getValue(), lin.getDerivative(0), 1.0e-15 * FastMath.abs(v[0].getValue()));
+ Assert.assertEquals(v[1].getValue(), lin.getDerivative(1), 1.0e-15 * FastMath.abs(v[1].getValue()));
+
+ lin = u[0].linearCombination(u[0], v[0], u[1], v[1], u[2], v[2]);
+ ref = u[0].getValue() * v[0].getValue() +
+ u[1].getValue() * v[1].getValue() +
+ u[2].getValue() * v[2].getValue();
+ Assert.assertEquals(ref, lin.getValue(), 1.0e-15 * FastMath.abs(ref));
+ Assert.assertEquals(v[0].getValue(), lin.getDerivative(0), 1.0e-15 * FastMath.abs(v[0].getValue()));
+ Assert.assertEquals(v[1].getValue(), lin.getDerivative(1), 1.0e-15 * FastMath.abs(v[1].getValue()));
+ Assert.assertEquals(v[2].getValue(), lin.getDerivative(2), 1.0e-15 * FastMath.abs(v[2].getValue()));
+
+ lin = u[0].linearCombination(u[0], v[0], u[1], v[1], u[2], v[2], u[3], v[3]);
+ ref = u[0].getValue() * v[0].getValue() +
+ u[1].getValue() * v[1].getValue() +
+ u[2].getValue() * v[2].getValue() +
+ u[3].getValue() * v[3].getValue();
+ Assert.assertEquals(ref, lin.getValue(), 1.0e-15 * FastMath.abs(ref));
+ Assert.assertEquals(v[0].getValue(), lin.getDerivative(0), 1.0e-15 * FastMath.abs(v[0].getValue()));
+ Assert.assertEquals(v[1].getValue(), lin.getDerivative(1), 1.0e-15 * FastMath.abs(v[1].getValue()));
+ Assert.assertEquals(v[2].getValue(), lin.getDerivative(2), 1.0e-15 * FastMath.abs(v[2].getValue()));
+ Assert.assertEquals(v[3].getValue(), lin.getDerivative(3), 1.0e-15 * FastMath.abs(v[3].getValue()));
+
+ }
+ }
+
+ @Test
+ public void testLinearCombination2DoubleDS() {
+ // we compare accurate versus naive dot product implementations
+ // on regular vectors (i.e. not extreme cases like in the previous test)
+ Well1024a random = new Well1024a(0xc6af886975069f11l);
+
+ for (int i = 0; i < 10000; ++i) {
+ final double[] u = new double[4];
+ final SparseGradient[] v = new SparseGradient[4];
+ for (int j = 0; j < u.length; ++j) {
+ u[j] = 1e17 * random.nextDouble();
+ v[j] = SparseGradient.createVariable(j, 1e17 * random.nextDouble());
+ }
+
+ SparseGradient lin = v[0].linearCombination(u[0], v[0], u[1], v[1]);
+ double ref = u[0] * v[0].getValue() +
+ u[1] * v[1].getValue();
+ Assert.assertEquals(ref, lin.getValue(), 1.0e-15 * FastMath.abs(ref));
+ Assert.assertEquals(u[0], lin.getDerivative(0), 1.0e-15 * FastMath.abs(v[0].getValue()));
+ Assert.assertEquals(u[1], lin.getDerivative(1), 1.0e-15 * FastMath.abs(v[1].getValue()));
+
+ lin = v[0].linearCombination(u[0], v[0], u[1], v[1], u[2], v[2]);
+ ref = u[0] * v[0].getValue() +
+ u[1] * v[1].getValue() +
+ u[2] * v[2].getValue();
+ Assert.assertEquals(ref, lin.getValue(), 1.0e-15 * FastMath.abs(ref));
+ Assert.assertEquals(u[0], lin.getDerivative(0), 1.0e-15 * FastMath.abs(v[0].getValue()));
+ Assert.assertEquals(u[1], lin.getDerivative(1), 1.0e-15 * FastMath.abs(v[1].getValue()));
+ Assert.assertEquals(u[2], lin.getDerivative(2), 1.0e-15 * FastMath.abs(v[2].getValue()));
+
+ lin = v[0].linearCombination(u[0], v[0], u[1], v[1], u[2], v[2], u[3], v[3]);
+ ref = u[0] * v[0].getValue() +
+ u[1] * v[1].getValue() +
+ u[2] * v[2].getValue() +
+ u[3] * v[3].getValue();
+ Assert.assertEquals(ref, lin.getValue(), 1.0e-15 * FastMath.abs(ref));
+ Assert.assertEquals(u[0], lin.getDerivative(0), 1.0e-15 * FastMath.abs(v[0].getValue()));
+ Assert.assertEquals(u[1], lin.getDerivative(1), 1.0e-15 * FastMath.abs(v[1].getValue()));
+ Assert.assertEquals(u[2], lin.getDerivative(2), 1.0e-15 * FastMath.abs(v[2].getValue()));
+ Assert.assertEquals(u[3], lin.getDerivative(3), 1.0e-15 * FastMath.abs(v[3].getValue()));
+
+ }
+ }
+
+ @Test
+ public void testSerialization() {
+ SparseGradient a = SparseGradient.createVariable(0, 1.3);
+ SparseGradient b = (SparseGradient) TestUtils.serializeAndRecover(a);
+ Assert.assertEquals(a, b);
+ }
+
+ private void checkF0F1(SparseGradient sg, double value, double...derivatives) {
+
+ // check value
+ Assert.assertEquals(value, sg.getValue(), 1.0e-13);
+
+ // check first order derivatives
+ for (int i = 0; i < derivatives.length; ++i) {
+ Assert.assertEquals(derivatives[i], sg.getDerivative(i), 1.0e-13);
+ }
+
+ }
+
+}
Propchange: commons/proper/math/trunk/src/test/java/org/apache/commons/math3/analysis/differentiation/SparseGradientTest.java
------------------------------------------------------------------------------
svn:eol-style = native
Propchange: commons/proper/math/trunk/src/test/java/org/apache/commons/math3/analysis/differentiation/SparseGradientTest.java
------------------------------------------------------------------------------
svn:keywords = "Author Date Id Revision"
Re: svn commit: r1536073 - in /commons/proper/math/trunk/src: changes/changes.xml main/java/org/apache/commons/math3/analysis/differentiation/SparseGradient.java test/java/org/apache/commons/math3/analysis/differentiation/SparseGradientTest.java
Posted by Sean Owen <sr...@gmail.com>.
On Sun, Oct 27, 2013 at 10:41 AM, Luc Maisonobe <lu...@spaceroots.org> wrote:
>>> + return 743 + 809 * + 233 * MathUtils.hash(value) + 167 *
>> ^^^^^^
>> Typo?
>
> Yes. I don't even understand why this code does not raise a compilation
> error ...
>
> Thanks for spotting this. Everything is fixed in the subversion
> repository now.
The "+" becomes a unary plus operator. It's read as "809 * (+233) *
MathUtils.hash..."
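The unary-plus reading can be checked with a small standalone snippet (illustrative values only, not the actual hash constants' intended combination):

```java
public class UnaryPlusDemo {
    public static void main(String[] args) {
        // "809 * + 233" compiles because the stray "+" is parsed as a
        // unary plus applied to 233, so the expression is 809 * (+233).
        int accidental = 743 + 809 * + 233 * 7; // 743 + (809 * (+233) * 7)
        int noPlus     = 743 + 809 * 233 * 7;   // same value: unary plus is a no-op
        System.out.println(accidental == noPlus); // true
        System.out.println(accidental);           // 1320222
    }
}
```

This is why the compiler accepts the typo: the expression is well-formed, it just does not combine the terms the way the author intended.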
---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@commons.apache.org
For additional commands, e-mail: dev-help@commons.apache.org
Re: svn commit: r1536073 - in /commons/proper/math/trunk/src: changes/changes.xml main/java/org/apache/commons/math3/analysis/differentiation/SparseGradient.java test/java/org/apache/commons/math3/analysis/differentiation/SparseGradientTest.java
Posted by Luc Maisonobe <lu...@spaceroots.org>.
On 27/10/2013 14:34, Gilles wrote:
> On Sun, 27 Oct 2013 09:52:44 -0000, luc@apache.org wrote:
>> Author: luc
>> Date: Sun Oct 27 09:52:44 2013
>> New Revision: 1536073
>>
>> URL: http://svn.apache.org/r1536073
>> Log:
>> Added SparseGradient to deal efficiently with numerous variables.
>>
>> [...]
>> +
>> + /**
>> + * Multiply in place.
>> + * <p>
>> + * This method is designed to be faster when used multiple times
>> in a loop.
>> + * </p>
>> + * <p>
>> + * The instance is changed here; in order to not change the
>> + * instance, the {@link #add(SparseGradient)} method should
>> + * be used.
>> + * </p>
>> + * @param a instance to multiply
>> + */
>> + public void multInPlace(final SparseGradient a) {
> ^^^^^^^^^^^
>
> Shouldn't the name be composed of full words (i.e. "multiplyInPlace")?
The original name was the one from the patch, but full words are better.
I have fixed this.
>
>> [...]
>> +
>> + /**
>> + * Get a hashCode for the derivative structure.
>> + * @return a hash code value for this object
>> + * @since 3.2
>> + */
>> + @Override
>> + public int hashCode() {
>> + return 743 + 809 * + 233 * MathUtils.hash(value) + 167 *
> ^^^^^^
> Typo?
Yes. I don't even understand why this code does not raise a compilation
error ...
Thanks for spotting this. Everything is fixed in the subversion
repository now.
best regards,
Luc
>
>> derivatives.hashCode();
>> + }
>> +
>> +}
>>
>
> Regards,
> Gilles
>
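The in-place vs. functional distinction behind the `multiplyInPlace` naming discussion can be sketched with a toy class (hypothetical `Scalar`, not the real SparseGradient internals): the in-place variant mutates the receiver and avoids allocating a new object on every loop iteration, while the functional variant leaves the receiver untouched.

```java
// Toy illustration of the in-place pattern; not the actual SparseGradient code.
final class Scalar {
    private double value;
    Scalar(double value) { this.value = value; }
    double getValue() { return value; }

    // Mutates this instance; avoids allocation in tight loops.
    void multiplyInPlace(Scalar a) { value *= a.value; }

    // Leaves this instance unchanged; returns a new product instance.
    Scalar multiply(Scalar a) { return new Scalar(value * a.value); }
}

public class InPlaceDemo {
    public static void main(String[] args) {
        Scalar x = new Scalar(2.0);
        Scalar y = x.multiply(new Scalar(3.0));  // x unchanged
        System.out.println(x.getValue());        // 2.0
        System.out.println(y.getValue());        // 6.0
        x.multiplyInPlace(new Scalar(3.0));      // x mutated
        System.out.println(x.getValue());        // 6.0
    }
}
```

The trade-off is the usual one: the mutating method is faster inside hot loops, while the functional method is safer when the original value is still needed, which is exactly what the Javadoc quoted above tries to convey.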
Re: svn commit: r1536073 - in /commons/proper/math/trunk/src: changes/changes.xml main/java/org/apache/commons/math3/analysis/differentiation/SparseGradient.java test/java/org/apache/commons/math3/analysis/differentiation/SparseGradientTest.java
Posted by Gilles <gi...@harfang.homelinux.org>.
On Sun, 27 Oct 2013 09:52:44 -0000, luc@apache.org wrote:
> Author: luc
> Date: Sun Oct 27 09:52:44 2013
> New Revision: 1536073
>
> URL: http://svn.apache.org/r1536073
> Log:
> Added SparseGradient to deal efficiently with numerous variables.
>
> [...]
> +
> + /**
> + * Multiply in place.
> + * <p>
> + * This method is designed to be faster when used multiple times
> in a loop.
> + * </p>
> + * <p>
> + * The instance is changed here; in order to not change the
> + * instance, the {@link #add(SparseGradient)} method should
> + * be used.
> + * </p>
> + * @param a instance to multiply
> + */
> + public void multInPlace(final SparseGradient a) {
^^^^^^^^^^^
Shouldn't the name be composed of full words (i.e. "multiplyInPlace")?
> [...]
> +
> + /**
> + * Get a hashCode for the derivative structure.
> + * @return a hash code value for this object
> + * @since 3.2
> + */
> + @Override
> + public int hashCode() {
> + return 743 + 809 * + 233 * MathUtils.hash(value) + 167 *
^^^^^^
Typo?
> derivatives.hashCode();
> + }
> +
> +}
>
Regards,
Gilles