Jacobian Matrix. The matrix of first-order partial derivatives of a continuous and differentiable function F (of multiple parameters), evaluated at some specific parameter vector x, is called the Jacobian matrix J of F. The Jacobian matrix plays an important role in most computational algorithms for estimating parameter values for nonlinear regression problems, in particular in the Gauss-Newton and Levenberg-Marquardt algorithms; see also Nonlinear Estimation for details.
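As an illustration, the Jacobian can be approximated numerically by forward differences; the function and parameter values below are hypothetical, and the helper name is our own:

```python
def numerical_jacobian(f, x, h=1e-6):
    # Forward-difference approximation of the Jacobian of f at x.
    # f maps a list of n parameters to a list of m function values;
    # the result J has J[i][j] = d f_i / d x_j.
    fx = f(x)
    J = [[0.0] * len(x) for _ in fx]
    for j in range(len(x)):
        xp = list(x)
        xp[j] += h                     # perturb one parameter
        fxp = f(xp)
        for i in range(len(fx)):
            J[i][j] = (fxp[i] - fx[i]) / h
    return J

# Example: F(x) = (x0*x1, x0^2) at x = (2, 3);
# the exact Jacobian is [[3, 2], [4, 0]].
J = numerical_jacobian(lambda x: [x[0] * x[1], x[0] ** 2], [2.0, 3.0])
```

In Gauss-Newton and Levenberg-Marquardt, a Jacobian of this kind (usually of the residuals) is recomputed at each iteration to determine the next parameter step.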
Jogging Weights. Adding a small random amount to the weights in a neural network, in an attempt to escape a local optimum in error space. See also, Neural Networks.
Johnson Curves. Johnson (1949) described a system of frequency curves that represents transformations of the standard normal curve (see Hahn and Shapiro, 1967, for details). By applying these transformations to a standard normal variable, a wide variety of non-normal distributions can be approximated, including distributions that are bounded on one or both sides (e.g., U-shaped distributions). The advantage of this approach is that once a particular Johnson curve has been fit, the normal integral can be used to compute the expected percentage points under the respective curve. Methods for fitting Johnson curves, so as to approximate the first four moments of an empirical distribution, are described in detail in Hahn and Shapiro, 1967, pages 199-220; and Hill, Hill, and Holder, 1976. See also, Pearson Curves.
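A minimal sketch of the unbounded (S_U) member of the Johnson system, using the conventional parameter names (gamma, delta, xi, lambda); the specific parameter values are illustrative only. It shows the key property mentioned above: because the curve is a transformation of a standard normal variable, percentage points follow directly from the normal integral:

```python
import math

def norm_cdf(z):
    # Standard normal integral Phi(z), via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def johnson_su(z, gamma, delta, xi, lam):
    # S_U transform of a standard normal deviate z:
    #   x = xi + lambda * sinh((z - gamma) / delta)
    return xi + lam * math.sinh((z - gamma) / delta)

def johnson_su_cdf(x, gamma, delta, xi, lam):
    # P(X <= x): invert the transform back to z, then apply
    # the normal integral
    return norm_cdf(gamma + delta * math.asinh((x - xi) / lam))

# Percentage point of a transformed normal deviate z = 1.0
x = johnson_su(1.0, gamma=0.5, delta=1.2, xi=0.0, lam=1.0)
p = johnson_su_cdf(x, gamma=0.5, delta=1.2, xi=0.0, lam=1.0)
```

By construction, p equals Phi(1.0); the bounded (S_B) family works the same way with a logistic-type transform instead of sinh.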
Join. A join shows how data is related between two tables. When two tables contain matching values on a field, records from the two tables can be combined by defining a Join. For example, suppose one table has the weight of objects with their associated part number and another table has part numbers and their associated product names. A join specifies that the two part number fields are equivalent and allows weights and product names to be related.
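The part-number example can be sketched as a simple inner join over two small tables; the data and helper name are hypothetical:

```python
# Two hypothetical tables sharing a part-number field
weights = [
    {"part_no": 101, "weight": 2.5},
    {"part_no": 102, "weight": 1.0},
]
names = [
    {"part_no": 101, "name": "Bracket"},
    {"part_no": 103, "name": "Hinge"},
]

def inner_join(left, right, key):
    # Combine records from the two tables whose key fields match
    index = {row[key]: row for row in right}
    return [
        {**lrow, **index[lrow[key]]}
        for lrow in left
        if lrow[key] in index
    ]

joined = inner_join(weights, names, "part_no")
# Only part 101 appears in both tables, so only its weight and
# product name are combined.
```

In SQL the same relationship would be declared as, e.g., `... FROM weights JOIN names ON weights.part_no = names.part_no`.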
Joining Networks (in Neural Networks). It is sometimes useful to join two networks together into a single composite network, for a number of reasons:
You might train one network to do preprocessing of data, and another to further classify the preprocessed data. Once completed, the networks can be joined together to classify raw data.
You might want to add a loss matrix to a classification network, to make minimum cost decisions.
Note: Networks can only be joined if the number of input neurons in the second network matches the number of output neurons in the first network. The input neurons from the second network are discarded, and their fan-out weights are attached to the output neurons of the first network.
Caution: The post-processing information from the first network and the input preprocessing information from the second network are also discarded. The composite network is unlikely to make sense unless you have designed the two networks with this in mind; i.e., with no post-processing performed by the first network and no preprocessing performed by the second network.
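The size check and composition described above can be sketched with networks represented simply as lists of weight matrices (biases and activation functions omitted for brevity; all names are our own):

```python
def join_networks(net1, net2):
    # Each network is a list of weight matrices, applied layer by
    # layer as x -> W x. Joining is only valid when the second
    # network's input size matches the first network's output size.
    n_out_first = len(net1[-1])      # rows of net1's last matrix
    n_in_second = len(net2[0][0])    # columns of net2's first matrix
    if n_out_first != n_in_second:
        raise ValueError("output/input layer sizes do not match")
    # The second network's input neurons are discarded; the fan-out
    # weights of its first layer now attach to net1's output neurons.
    return net1 + net2

def forward(net, x):
    # Run a vector through every layer of the network
    for W in net:
        x = [sum(w * xi for w, xi in zip(row, x)) for row in W]
    return x

# A 2-input/3-output network joined to a 3-input/1-output network
net1 = [[[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]]
net2 = [[[1.0, 1.0, 1.0]]]
composite = join_networks(net1, net2)
```

Running the composite network is then equivalent to feeding the first network's outputs into the second.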
JPEG. Acronym for Joint Photographic Experts Group. An ISO/ITU standard for storing images in compressed form using a discrete cosine transform.
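The transform at the heart of JPEG compression is the type-II discrete cosine transform; a direct (O(n^2), illustration-only) one-dimensional version can be written as:

```python
import math

def dct_ii(x):
    # Un-normalized type-II DCT of a sequence x; JPEG applies a
    # scaled 2-D version of this transform to 8x8 pixel blocks
    # and then quantizes the coefficients to achieve compression.
    n = len(x)
    return [
        sum(x[i] * math.cos(math.pi * (i + 0.5) * k / n)
            for i in range(n))
        for k in range(n)
    ]

# A constant block concentrates all its energy in the first
# ("DC") coefficient, which is why smooth image regions compress well.
coeffs = dct_ii([1.0, 1.0, 1.0, 1.0])
```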
JPG. A file name extension commonly used for image files stored in the JPEG format (see JPEG).