Examples of linear functions in the following topics:

 A zero, or $x$-intercept, is the point at which a linear function's value will equal zero.
 The graph of a linear function is a straight line.
 Linear functions can have none, one, or infinitely many zeros.
 To find the zero of a linear function algebraically, set $y=0$ and solve for $x$.
 The zero found by solving a linear function graphically must match the zero found by solving the same function algebraically.
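The algebraic step above can be sketched in a few lines of Python; the helper name `linear_zero` is illustrative, not from the text. Setting $y = 0$ in $y = mx + b$ and solving gives $x = -b/m$ when $m \neq 0$:

```python
def linear_zero(m, b):
    """Return the x-intercept of y = m*x + b, or None if m == 0."""
    if m == 0:
        # y = b is a horizontal line: no zero if b != 0,
        # and infinitely many zeros if b == 0.
        return None
    return -b / m

print(linear_zero(2, -6))  # 3.0: the line y = 2x - 6 crosses the x-axis at x = 3
print(linear_zero(0, 5))   # None: y = 5 never equals zero
```

The `None` branch reflects the two degenerate cases mentioned above: a horizontal line has either no zeros or infinitely many.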

 Linear and quadratic functions produce lines and parabolas, respectively, when graphed, and are among the simplest functional forms.
 In calculus and algebra, the term linear function refers to a function that satisfies the following two linearity properties: additivity, $f(x + y) = f(x) + f(y)$, and homogeneity, $f(ax) = af(x)$.
 Linear functions may be confused with affine functions.
 Linear functions form the basis of linear algebra.

 Linear functions are algebraic equations whose graphs are straight lines with unique values for their slope and $y$-intercepts.
 A linear function is an algebraic equation in which each term is either a constant or the product of a constant and (the first power of) a single variable.
 It is linear: the exponent of the $x$ term is a one (first power), and it follows the definition of a function: for each input ($x$) there is exactly one output ($y$).
 The blue line, $y=\frac{1}{2}x-3$, and the red line, $y=-x+5$, are both linear functions.
 Identify what makes a function linear and the characteristics of a linear function
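The defining properties above can be checked in a short sketch; the helper name `make_linear` is illustrative, not from the text. Each input maps to exactly one output, and the slope is the same between any two points:

```python
# A linear function y = m*x + b is determined by its slope m and y-intercept b.
def make_linear(m, b):
    """Return f(x) = m*x + b as a Python function."""
    return lambda x: m * x + b

f = make_linear(0.5, -3)       # the line y = (1/2)x - 3
assert f(0) == -3              # the y-intercept: input 0 has exactly one output
assert f(6) == 0               # a zero of the function
# The slope between any two distinct points is always m:
assert (f(10) - f(4)) / (10 - 4) == 0.5
```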

 A linear approximation is an approximation of a general function using a linear function.
 In mathematics, a linear approximation is an approximation of a general function using a linear function (more precisely, an affine function).
 Linear approximation is achieved by using Taylor's theorem to approximate the value of a function at a point.
 If one were to take an infinitesimally small step away from $a$, the linear approximation would exactly match the function.
 Linear approximations for vector functions of a vector variable are obtained in the same way, with the derivative at a point replaced by the Jacobian matrix.
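For the scalar case, the linearization of $f$ at $a$ is $L(x) = f(a) + f'(a)(x - a)$. A minimal sketch (the helper name `linear_approx` is hypothetical) applied to $\sqrt{x}$ near $a = 4$:

```python
import math

def linear_approx(f, df, a):
    """Return the linearization L(x) = f(a) + f'(a)*(x - a) of f at a."""
    return lambda x: f(a) + df(a) * (x - a)

# sqrt(4) = 2 and d/dx sqrt(x) = 1 / (2*sqrt(x)), so f'(4) = 0.25.
L = linear_approx(math.sqrt, lambda x: 1 / (2 * math.sqrt(x)), 4)

print(L(4.1))          # 2.025, close to the true value
print(math.sqrt(4.1))  # ~2.02485
```

The approximation error grows as $x$ moves away from $a$, consistent with the remark above that the match is exact only in the infinitesimal limit.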

 A quadratic equation is a specific case of a quadratic function, with the function set equal to zero: $ax^2 + bx + c = 0$, where $a \neq 0$.
 Quadratic functions differ from linear functions in a few key ways.
 Linear functions either always decrease (if they have negative slope) or always increase (if they have positive slope).
 With a non-constant linear function, each output corresponds to exactly one input, so the function is one-to-one.
 The slope of a quadratic function, unlike the slope of a linear function, is constantly changing.

 A second-order linear differential equation has the form $\frac{d^2 y}{dt^2} + A_1(t)\frac{dy}{dt} + A_2(t)y = f(t)$, where $A_1(t)$, $A_2(t)$, and $f(t)$ are continuous functions.
 Linear differential equations are of the form $Ly = f$, where the differential operator $L$ is a linear operator, $y$ is the unknown function (such as a function of time $y(t)$), and the right hand side $f$ is a given function of the same nature as $y$ (called the source term).
 The linear operator $L$ may be considered to be of the form $L = \frac{d^2}{dt^2} + A_1(t)\frac{d}{dt} + A_2(t)$,
 where $A_1(t)$, $A_2(t)$, and $f(t)$ are continuous functions.
 When $f(t)=0$, the equations are called homogeneous second-order linear differential equations.
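A numerical sketch of this form: the sketch below uses a crude explicit Euler integration (not an analytic method; the helper name is hypothetical), rewriting $y'' + A_1(t) y' + A_2(t) y = f(t)$ as a first-order system in $y$ and $v = y'$:

```python
import math

def euler_second_order(A1, A2, f, y0, v0, t_end, n):
    """Crude explicit-Euler integration of y'' + A1(t) y' + A2(t) y = f(t)
    with y(0) = y0 and y'(0) = v0, using n steps up to t_end."""
    dt = t_end / n
    t, y, v = 0.0, y0, v0
    for _ in range(n):
        # y' = v and v' = f(t) - A1(t) v - A2(t) y
        y, v = y + dt * v, v + dt * (f(t) - A1(t) * v - A2(t) * y)
        t += dt
    return y

# Homogeneous case f(t) = 0: y'' + y = 0 with y(0) = 1, y'(0) = 0
# has the exact solution y(t) = cos(t).
y = euler_second_order(lambda t: 0, lambda t: 1, lambda t: 0, 1.0, 0.0, 1.0, 100000)
print(y, math.cos(1.0))  # both ≈ 0.5403
```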

 A common form of a linear equation in the two variables $x$ and $y$ is $y = mx + b$, where $m$ is the slope and $b$ is the $y$-intercept.
 Since terms of linear equations cannot contain products of distinct or equal variables, nor any power (other than $1$) or other function of a variable, equations involving terms such as $xy$, $x^2$, $y^{\frac{1}{3}}$, and $\sin x$ are nonlinear.
 Linear differential equations are of the form $Ly = f$,
 where the differential operator $L$ is a linear operator, $y$ is the unknown function (such as a function of time $y(t)$), and $f$ is a given function of the same nature as y (called the source term).
 For a function dependent on time, we may write the equation more expressly as $Ly(t) = f(t)$.

 In statistics, linear regression can be used to fit a predictive model to an observed data set of $y$ and $x$ values.
 In statistics, simple linear regression is the least squares estimator of a linear regression model with a single explanatory variable.
 Linear regression was the first type of regression analysis to be studied rigorously, and to be used extensively in practical applications.
 A common form of a linear equation in the two variables $x$ and $y$ is $y = mx + b$.
 The origin of the name "linear" comes from the fact that the set of solutions of such an equation forms a straight line in the plane.
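The least-squares slope and intercept of a simple linear regression can be computed directly from the standard closed-form formulas $m = \sum(x_i - \bar{x})(y_i - \bar{y}) / \sum(x_i - \bar{x})^2$ and $b = \bar{y} - m\bar{x}$; the helper name below is illustrative:

```python
def simple_linear_regression(xs, ys):
    """Least-squares fit of y = m*x + b to paired data; returns (m, b)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    m = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    b = mean_y - m * mean_x
    return m, b

# Data lying exactly on the line y = 2x + 1 is recovered exactly:
m, b = simple_linear_regression([0, 1, 2, 3], [1, 3, 5, 7])
print(m, b)  # 2.0 1.0
```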

 Linear equations are those in which every variable appears only to the first power.
 There is in fact a field of mathematics known as linear algebra, in which linear equations in up to an infinite number of variables are studied.
 Linear equations can therefore be expressed in general (standard) form as $a_1 x_1 + a_2 x_2 + \cdots + a_n x_n = b$.
 If the drivers want to designate a meeting point, they can algebraically find the point of intersection of the two functions.
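Finding that meeting point amounts to solving $m_1 x + b_1 = m_2 x + b_2$ for $x$; the drivers' specific functions below are hypothetical examples, and the helper name is illustrative:

```python
def intersection(m1, b1, m2, b2):
    """Intersection point (x, y) of y = m1*x + b1 and y = m2*x + b2."""
    if m1 == m2:
        raise ValueError("parallel lines: no unique intersection")
    # Set m1*x + b1 = m2*x + b2 and solve for x.
    x = (b2 - b1) / (m1 - m2)
    return x, m1 * x + b1

# Hypothetical drivers' positions as linear functions of time t:
# driver A at y = 2t + 1 and driver B at y = -t + 7 meet where 2t + 1 = -t + 7.
print(intersection(2, 1, -1, 7))  # (2.0, 5.0)
```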

 In the previous atom, we learned that a second-order linear differential equation has the form $\frac{d^2 y}{dt^2} + A_1(t)\frac{dy}{dt} + A_2(t)y = f(t)$,
 where $A_1(t)$, $A_2(t)$, and $f(t)$ are continuous functions.
 When $f(t)=0$, the equations are called homogeneous second-order linear differential equations.
 Phenomena such as heat transfer can be described using nonhomogeneous second-order linear differential equations.
 Identify when a secondorder linear differential equation can be solved analytically
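One case that can always be solved analytically is the homogeneous equation with constant coefficients, $y'' + py' + qy = 0$: substituting $y = e^{rt}$ reduces it to the characteristic equation $r^2 + pr + q = 0$. A minimal sketch (the helper name is hypothetical):

```python
import cmath

def characteristic_roots(p, q):
    """Roots of r**2 + p*r + q = 0; each root r gives a solution e^(r*t)
    of the constant-coefficient homogeneous equation y'' + p*y' + q*y = 0."""
    disc = cmath.sqrt(p * p - 4 * q)   # complex sqrt handles oscillatory cases
    return (-p + disc) / 2, (-p - disc) / 2

# y'' - 3y' + 2y = 0 has characteristic equation r^2 - 3r + 2 = 0,
# with roots 2 and 1, giving the solutions e^(2t) and e^t.
print(characteristic_roots(-3, 2))  # ((2+0j), (1+0j))
```

Complex roots correspond to oscillatory solutions (sines and cosines), which is why the complex square root is used.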