# Curves that would pass the vertical line test but cannot be represented by a function


#### jasonbe

##### New Member
Are there any curves that would pass the vertical line test but cannot be represented by a function?

Sorry, but that question doesn't make any sense unless it's put into context.

Are there any curves that would pass the vertical line test but cannot be represented by a function?

Hi,

The vertical line test is a pretty rigorous test in theory: the test
line is infinitesimally thin, and if on that line you can find two or
more points that the relation generates, then that relation cannot be
a 'true' function.
Of course in real life we often have to deal with multivalued relations
anyway and recognize when they have to be broken up, and
that brings up other qualifying classes of functions, like
real functions, differentiable functions, etc.
Sometimes we have to break the function up into two parts and
find the part that works for our application in the real world.
For example, if we had a circular function that generated a circle
we could get two solutions, one positive and one negative, and
we would have to test which one works in our real application.
Sometimes the solution is obvious though, such as if we calculate
the value of a resistor...if we get a negative and positive value we usually
assume it's the positive one.

This is really another topic but...
One simple function that is interesting is y = int(x). We can
always calculate y knowing x, but we cannot recover x knowing y,
because many different values of x produce the same value of y.
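As a quick illustration, here is a small Python sketch (the sample values are arbitrary choices for the example) showing why the floor function cannot be inverted:

```python
import math

# y = int(x) (the floor function) maps whole intervals of x to a single y,
# so knowing y cannot tell you which x produced it.
xs = [2.0, 2.25, 2.5, 2.99]
ys = [math.floor(x) for x in xs]
print(ys)  # [2, 2, 2, 2] -- four different x values, one y value
```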

The vertical line test is used to test a relation that already exists, so first
you have the curve at hand, and then you test it. If it passes, the curve
may represent a function, but if it fails you know right then and there
that it is not a function.
I'm sure there are functions that will pass the test but will not be differentiable,
so they could not be considered differentiable functions, for example. By the
definition "one value of y for any given x from -inf to +inf," they may still be
considered functions, though.
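For sampled points, the test is easy to mechanize. Here is a minimal Python sketch (the function name and the sample sets are my own, just for illustration):

```python
def passes_vertical_line_test(points):
    """Return True if no two sampled points share an x value with different y values."""
    seen = {}
    for x, y in points:
        if x in seen and seen[x] != y:
            return False  # two points lie on the same vertical line
        seen[x] = y
    return True

# a sampled parabola passes; a sampled circle fails
parabola = [(x, x ** 2) for x in range(-3, 4)]
circle = [(0, 1), (0, -1), (1, 0), (-1, 0)]
print(passes_vertical_line_test(parabola))  # True
print(passes_vertical_line_test(circle))    # False
```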


I've heard that y = x^2 defines a locus of points that are the ends of line segments whose difference of lengths is equal to a constant, with half of the line segments having an end coincident with a point, and the other half having an end meeting a line perpendicularly. If I take the derivative of this I get 2x. Is it possible to define a geometric relationship in a similar way between x^2 and 2x? What is a geometric description of y = x^3? Can it be described in terms of the sum or difference of line segments between two shapes?

I've heard that y = x^2 defines a locus of points that are the ends of line segments whose difference of lengths is equal to a constant, .............

Hi,

I'm not sure what you are saying here, because if the difference were a
constant then all the lines would be the same length. Are you talking...

I'm not sure if this helps or not, but...
If x is interpreted as a length along some dimension, then
x^2 is an area, and x^3 is a volume. Thus, integration or
differentiation takes us up one or down one dimension, respectively.

Last edited:
Jasonbe... do you by any chance mean this?

#### Attachments

• parabola.jpg
Hi,

I was going to ask him to draw a picture just to describe this.
Maybe that is it?

Yeah, the only thing is I'm pretty sure what he said is that the lines differed by a constant, so I assume he meant that one line would be 10 cm, the next 9.99, the next 9.98, etc. If these lines were vertical and there were an infinite number of them, this could be what he was referring to, maybe... I'm not sure.

I'm attaching a picture to this thread that I hope will communicate what I meant by subtracting the lengths of line segments. I understand how differentiation and integration can be used to change dimensions, and I think this may involve a process that can be explained geometrically with tangent lines. I am interested in other types of lines that may be able to describe shapes in addition to tangent lines, which is why I posted the question to which the attached picture is related.

Areas and volumes are a little too much for me to determine what shapes they might be related to. I was interested in knowing if lines that weren't tangent lines could also be used to geometrically describe y = x^3 and perhaps other equations. The only things left besides areas and shapes having more dimensions are lines and points, and I'd be interested in knowing how many shapes can be described in terms of lengths of line segments related to lines and points. I've heard about conic sections, and I'd like to know if lines and points could be used to describe those shapes, as well as functions that have more than one finite maximum and minimum.

The only other picture that has been posted to this thread is interesting. Is there a name for the curve that it might resemble if more lines were drawn? Are tangent lines a common method used to define curves? Can pictures such as that one be used to describe statistical methods?

#### Attachments

Effectively the picture that I posted is a tangent picture; each straight line is the tangent to the curve at a certain point.

Is there a name for the curve that it might resemble if more lines were drawn?

It really is just a curve; it could be manipulated to resemble half of a quadratic, a quarter of a circle, a portion of an ellipse, or a representation of a logarithmic or exponential graph. You could even go as far as to use it to represent half a truncus or hyperbola.

To achieve these graphs you would need to find a function for the tangent to one of the graphs at any point, then treat those lines as the axis lines and draw appropriate straight lines.

e.g. for a section of a quadratic, let's start with the simple f(x) = x^2 graph, which I assume you know what it looks like given the graph that you drew.

Find the gradient function (derivative), which is f'(x) = 2x.
You can then use the formula to find the tangent at any point on the graph.
e.g.

at the point (1,1)
Sub x = 1 into the gradient function to find the gradient of the tangent at this point.

f'(1) = 2

Use the formula for the equation of a straight line to find the equation:

y - y1 = m(x - x1)

where m is the gradient found, and (x1, y1) is the point of tangency.

-> y - 1 = 2(x - 1)
y = 2x - 1

If you want to transpose this tangent onto the desired graph, find the axis intercepts for the tangent and draw them in (this will give you a reflection of the portion of the graph).

I suppose you could make a general formula similar to
y - y1 = f'(x1)(x - x1)
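That general formula is easy to turn into code. Here is a minimal Python sketch (the helper name tangent_line is my own, just for illustration):

```python
def tangent_line(f, fprime, x1):
    """Slope m and intercept b of the tangent y = m*x + b to f at x = x1."""
    m = fprime(x1)
    b = f(x1) - m * x1
    return m, b

# tangent to f(x) = x^2 at the point (1, 1)
m, b = tangent_line(lambda x: x ** 2, lambda x: 2 * x, 1)
print(m, b)  # 2 -1, i.e. y = 2x - 1 as derived above
```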

The pic I have attached shows, rather poorly, how you can manipulate the graphs.

#### Attachments


Hi,

That 'constant total' is the distance from the focus to the
curve plus the distance from the curve to a horizontal line, for a curve like
y = A2*x^2 + A1*x + A0.
Given x1 and x2, the length from the focus to the curve plus the length from
the curve up to an arbitrary horizontal line is the same. For example, with
the curve y = x^2 (whose focus is at (0, 1/4)) and the line y = 5: for x1 = 1
the sum of distances is 5.25, with x2 = 2 the sum is still 5.25, and for
x3 = 3 (where the curve lies above the line, so the second distance counts
as negative) the total is again 5.25. The total is always the same; this is
really the parabola's focus/directrix property in disguise.
You could take a look at Wikipedia's entry for parabolas and that
would tell you more.
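Here is a quick numerical check of that claim in Python, assuming the focus of y = x^2 at (0, 1/4) and the line y = 5, and counting the curve-to-line distance as signed so points above the line subtract:

```python
import math

focus = (0.0, 0.25)   # focus of y = x^2
line_y = 5.0          # an arbitrary horizontal reference line

def total_distance(x):
    y = x ** 2
    focal = math.hypot(x - focus[0], y - focus[1])  # focus-to-curve distance
    to_line = line_y - y                            # signed curve-to-line distance
    return focal + to_line

for x in (1, 2, 3):
    print(total_distance(x))  # 5.25 every time
```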

Are tangent lines a common method that is used to define curves?
Unless you want to get rid of all differential equations, this is always
going to be a *very* common way of defining curves. In the pic provided
below, the green dashes represent short straight line segments that show
a tiny portion of the slope of the curve at that one point, i.e. what the
slope would be if the curve we are after were drawn through that point.
That entire set of green line segments makes up what is known as the
"slope field", and of course that is simply a field of slopes. We don't
know what the curve is yet (ignore the red dotted line for now), so we
can't really draw the curve yet.
Once we get the constant, though, we can home in on the correct curve
(see the red dotted line), and that is the solution.
In other words, the equation
dy/dx = something(x)
was plotted first (the green line segments), and later we are given the
constant, and that's when we can draw the red line. Since dy/dx is a
slope, and the equation tells us lots of slopes, we plot the slopes
instead of points, then later find the total solution.
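A slope field like this is exactly what numerical solvers follow. As a sketch, here is Euler's method (the simplest possible scheme) in Python, assuming the example equation dy/dx = 2x with y(0) = 1, whose exact solution is y = x^2 + 1:

```python
# Euler's method: follow the slope field dy/dx = f(x, y) step by step
# from a known starting point.
def euler(f, x0, y0, x_end, steps):
    h = (x_end - x0) / steps
    x, y = x0, y0
    for _ in range(steps):
        y += h * f(x, y)   # move along the local slope segment
        x += h
    return y

# dy/dx = 2x with y(0) = 1 has the exact solution y = x^2 + 1, so y(2) = 5
approx = euler(lambda x, y: 2 * x, 0.0, 1.0, 2.0, 10000)
print(approx)  # very close to 5.0
```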

#### Attachments

I once had a math teacher who said something about finding a curve of best fit that I can best explain as typing data into a calculator and testing some of the predefined functions on the calculator to see which one looked the best, or comparing computed errors, or something else; I don't remember. That seemed very practical to me, and at the time I didn't think it was very scientific. After reconsidering, I would still like to know how scientists choose a curve of best fit.

I've researched what theories, principles, postulates, laws and some other things are, and I was able to understand some of them, I think, in a context that applies to this question. For example, I thought that some functions might be made from raw data alone; sometimes formulas describing deviations from known equations might also be formulated from raw data, and maybe some of these known equations have imprecise coefficients, or maybe an experiment involved calculating a coefficient. However, I've seen so many equations, most of which I didn't understand, that I wonder about their history. I might not be able to ask a complete question about most of these formulas.

Still, given a raw set of data, how can a person identify what kind of formula would best represent the data? Maybe a better question would be: what do scientists look for once they have organized their data and want an equation that represents it?

Hi again,

A good place to start is with the "least sum of squares" fit criterion.
With this technique, each data point is run through the equation, which
generates a 'y' value (called the 'predicted' value), and this value is
subtracted from the actual raw data 'y' value to generate an error 'err'.
That error is then squared, which is a good idea for a couple of reasons.
That squared error is then summed with the other squared errors to form
the 'sum of squared errors', or simply the 'sum'. The coefficients are
then varied in an organized way until the least sum is found, and the
coefficients that resulted in that least sum are considered to be the best fit.
There are various techniques used to vary the coefficients in an orderly way,
including but not limited to what is called "the method of steepest descent".
I think that would be a good place for you to start so you can get a feel for
what this is all about. I'm sure if you look those terms up on the web all
sorts of articles will turn up. These topics fall under the general category of
"Numerical Analysis" and/or "Numerical Methods".

You could start with an equation that you are well familiar with, like
y = x^2 + 1 for example, then generate bogus data by adding a small random
sample to each y value to create, say, 10 values that are close to the
real value of y for each x. Then, use your new method to fit
y = k*x^2 + 1 to the data, and see what coefficient k you end up with
(it is ideally equal to 1, but assume you don't know that yet).
You could also experiment with varying the constant too, so you would start
with
y = k2*x^2 + k1
When these fits are started it is sometimes a good idea to bound the constants
to something reasonable, depending on the application, such as -10 < k < 10 or
something like that.
Take a look on the web and see what you can find. It gets pretty interesting.
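That exercise can be sketched in a few lines of Python. This is a toy version that minimizes the sum of squared errors with a crude grid search over k rather than steepest descent; the data, the random seed, and the search granularity are all arbitrary choices for the example:

```python
import random

random.seed(0)

# generate noisy data around the "true" curve y = x^2 + 1
xs = [0.5 * i for i in range(10)]
data = [(x, x**2 + 1 + random.uniform(-0.1, 0.1)) for x in xs]

def sse(k):
    """Sum of squared errors for the trial model y = k*x^2 + 1."""
    return sum((y - (k * x**2 + 1))**2 for x, y in data)

# crude grid search over the bounded range -10 < k < 10, step 0.001
best_k = min((k / 1000 for k in range(-10000, 10001)), key=sse)
print(best_k)  # lands very close to the true value 1
```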

One of the most interesting techniques I have found in the past is 'evolution'
fitting. This is probably the most extreme: the idea is to set up a whole
group of candidate equations that are made to evolve, and the one that gets
closest to the fit gets to 'live' to the next generation, while the ones
that are far off are killed off as soon as possible.
The beauty of this technique is that we don't have to know much about the
equation to begin with, but the drawback is that it is very time consuming
even on a fast computer, and if it is the first time you have ever tried to
fit an equation to data, it also doesn't teach much.
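A toy sketch of that 'evolution' idea in Python might look like the following (this is essentially a very simple genetic algorithm; the population size, mutation size, generation count, and seed are all arbitrary choices for the example):

```python
import random

random.seed(1)

# noisy samples of the "unknown" target curve y = x^2 + 1
data = [(x, x**2 + 1 + random.uniform(-0.1, 0.1)) for x in range(-5, 6)]

def fitness(k2, k1):
    """Sum of squared errors for the candidate y = k2*x^2 + k1 (lower is better)."""
    return sum((y - (k2 * x**2 + k1))**2 for x, y in data)

# start with a random population of candidate coefficient pairs
population = [(random.uniform(-10, 10), random.uniform(-10, 10)) for _ in range(50)]

for generation in range(200):
    # the closest fits 'live' to the next generation; the rest are killed off
    population.sort(key=lambda c: fitness(*c))
    survivors = population[:10]
    # survivors reproduce with small random mutations
    children = [(k2 + random.gauss(0, 0.1), k1 + random.gauss(0, 0.1))
                for k2, k1 in survivors for _ in range(4)]
    population = survivors + children

best = min(population, key=lambda c: fitness(*c))
print(best)  # should drift close to the true coefficients (1, 1)
```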



Is evolution fitting a nickname for something that I can look up on the web? I'm interested in learning how continuous and comprehensive evolution fitting is. I eventually plan on finding a math site at a university and looking for a layperson's description of the topics, if there are such descriptions. I think that this may help me learn more about curves and shapes of best fit. Any suggestions for finding such a descriptive list? How many of these topics are covered by evolution fitting?

Are there any curves that would pass the vertical line test but cannot be represented by a function?
The error function will certainly pass the vertical line test, but it is not an "elementary function": its values can only be computed by numerically approximating a definite integral, so there is no closed-form expression for computing them exactly.

Look here for a more precise definition of "elementary function"
Elementary function - Wikipedia, the free encyclopedia
Look here for a more precise definition of "transcendental functions"
Transcendental function - Wikipedia, the free encyclopedia
Look here for a precise definition of "analytic functions"
Analytic function - Wikipedia, the free encyclopedia
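To make that point about the error function concrete, here is a Python sketch that approximates erf(x) by numerically integrating its defining integral with a midpoint rule (the step count is an arbitrary choice), then compares against Python's built-in math.erf (itself an approximation):

```python
import math

def erf_approx(x, n=100000):
    """Approximate erf(x) = (2/sqrt(pi)) * integral_0^x exp(-t^2) dt
    with a simple midpoint rule; exp(-t^2) has no elementary antiderivative."""
    h = x / n
    total = sum(math.exp(-((i + 0.5) * h) ** 2) for i in range(n))
    return (2.0 / math.sqrt(math.pi)) * total * h

# compare the numerical approximation with the library routine
print(erf_approx(1.0), math.erf(1.0))
```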

The vertical line test seems pretty straightforward; I fail to understand all this woogie boogie discussion. It is simple: each x value has only one y value. Anything else is not a function.


Hello,

If you remember, I suggested looking at the least sum of squares fitting
first. You should get a firm grasp of how this works before moving on to
other types of fitting.
Evolutionary fitting (there are perhaps other names) is a way of
experimentally trying different functions in order to fit a set of
functions to a given set of data. It is different from other types of
fitting because it does not use a mathematically rigorous technique to
find the best fit, but rather mimics nature's trial and error, similar
to how evolution 'works'.
Try the least squares fitting first and see if that begins to make sense
to you, then I'll help you move to other types of fitting if you prefer.
I don't know any other names for it, but you can ask around other sites.


I started learning about the least sum of squares fitting. However, though you are entitled to your approach, I can't think of any reason why anyone wouldn't suggest words for a search, whether or not understanding the least sum of squares is important for understanding evolutionary fitting first. If it is, then I thank you in advance for this information. However, it may be that a person could use the information at a site describing evolutionary fitting to learn what they need to know.

By the way, you came across, at least to me, as though you were talking for everyone involved with this site. I'm not sure; you may be right, though. Do you think that people have to understand math before they can understand math concepts? Are all information sources that present math organized in this way? For example, I imagine that there are algorithms for identifying written signatures, fingerprints, retina scans, etc., all of which involve math, but none of which necessitates that a person have an initial understanding of math before they can begin to understand what they do, and even how they do it. For example, a person could learn about an identification algorithm by learning about what most makes the things identified unique. This may not apply to understanding evolutionary fitting, though.

Also, when I visited the screen that is used to make posts, I noticed that not all posts are being displayed at the page that initially appears when a thread name is selected.

Hi again,

I myself don't know of any other name for it. Perhaps someone else does?

I suggested least sum of squares because it is somewhat popular, it is
widely used, and it will get you started with curve fitting in general.
It is used as the basis of other curve-fitting techniques too.

What it looks like to me is that you are attempting to understand some of
the more advanced topics in mathematics before learning some of the basics.
This is going to be very hard to do, if not impossible. Normally courses take
you through some rudimentary fundamentals and then you work your way
up to the more difficult stuff. If you try the difficult stuff first, you end up
constantly referring back to the other stuff anyway, so you should really have
that under your belt first. It also makes life easier that way. You also end
up answering many of your own questions that way, and it's kind of fun to
do that.
I'm not totally sure if what we have been calling 'evolutionary' fitting would do
you any good or not, but unfortunately you'll have to find some references
on the web in order to make any use at all of this, if it is possible.

If you can't find anything, I can check around a bit and see if I can locate any
info on this...


Thanks. Sometimes I find math intimidating, and I may have gotten a little defensive for that reason. Evolutionary fitting would seem to me as though it would involve a curve that was known at first. What I don't understand is how a function could mutate in such a way as to be compared to the known curve. Adding or subtracting areas between intercepts might be useful information for identifying how the curves differ over different segments of the domain.

The only way I can think of to create a function that could generate many terms potentially representing a curve would be to use an infinite series. However, how could it be determined whether creating a curve more representative of the original function would involve changing the number of terms or the terms themselves? And how could the terms in the series be changed to compare all curves against the curve that is the best representation? Can all curves even be represented mathematically?

Someone had mentioned animation earlier. I wonder if that is an industry that has done a lot of research in this area. However, people in that industry might be able to use formulas with ranges limited to representing movements that animals are normally capable of. Still, heart and back muscles might not be limited in this respect. Are there different classes of curves and shapes? I'd still be interested in knowing what they are, even though I'm going to start a new thread that is a little more specific.
