View Full Version : Highest level of math

afterburner

2006-Aug-11, 12:53 AM

What are the three highest levels of math that humanity knows? As of 2006.

Thanks

01101001

2006-Aug-11, 01:40 AM

What is a level of math?

AGN Fuel

2006-Aug-11, 03:16 AM

The maths department at my alma mater was on the 5th & 6th floors. That's the highest level of maths in my personal experience, but I'm sure others can go better!

Kaptain K

2006-Aug-11, 03:21 AM

Dang! I posted to this thread. Where'd it go?

afterburner

2006-Aug-11, 03:31 AM

Come on people, you know what I mean...or do you? :eh:

As in...simple adding...arithmetics...algebra perhaps...geometry...calculus...more advanced geometry...matrices...(highest level)

I guess, what's the most advanced math? Hmm...hope this helps.

George

2006-Aug-11, 03:33 AM

Kaptain K, there seemed to be some maintenance going on late this afternoon. I lost a post, too, but I got some kind of message that something was being done to the system.

Tensor

2006-Aug-11, 03:47 AM

Come on people, you know what I mean...or do you? :eh:

As in...simple adding...arithmetics...algebra perhaps...geometry...calculus...more advanced geometry...matrices...(highest level)

I guess, whats the most advanced math? hmm...hope this helps.

How do you propose to quantify which is more advanced than another?

George

2006-Aug-11, 03:49 AM

Come on people, you know what I mean...or do you? :eh:

As in...simple adding...arithmetics...algebra perhaps...geometry...calculus...more advanced geometry...matrices...(highest level)

I guess, whats the most advanced math? hmm...hope this helps.

It's been too long since my last course for me to offer any worthy suggestions of the most advanced math, but then, I would guess there is no ultimate math subject requiring all the others to be prerequisites. String Theory looks pretty tough, though. Non-linear math and GR might be worth mentioning.

It may be like asking what is the most advanced car or song; its level of sophistication is in the mind of the beholder.

Jeff Root

2006-Aug-11, 03:50 AM

At the other end of the spectrum, I think most people would agree that distinguishing "something" from "nothing" is the simplest or lowest level of math possible. (Essentially, counting to one.) After that, I doubt that you will get much agreement.

-- Jeff, in Minneapolis

Elyk

2006-Aug-11, 03:54 AM

I think Jeff is right; understanding math, or being able to quantify things, is probably as good as it gets.

AGN Fuel

2006-Aug-11, 03:55 AM

How do you propose to quantify which is more advanced than another?

By maths. :wall:

(Would that then be metamaths?? ;) )

George

2006-Aug-11, 03:56 AM

(Would that then be metamaths?? ;) )

Do I recall something about transcendental functions? ;)

Tensor

2006-Aug-11, 04:01 AM

By maths. :wall:

(Would that then be metamaths?? ;) )

:doh: ;)

Tim Thompson

2006-Aug-11, 04:42 AM

What are the three highest levels of math that humanity knows? As of 2006.

I think it is safe to say that the question cannot be answered, on the grounds that it makes no sense. Mathematics simply does not work that way; there is no such thing as "the highest level" in mathematics.

You do have to differentiate between "pure" and "applied" mathematics. But beyond that, the concept of higher levels is simply not operative. The different "levels" are just different branches in the world of mathematics (http://mathworld.wolfram.com/).

snarkophilus

2006-Aug-11, 05:27 AM

Judging from what my math Ph.D. friends have trouble with, the toughest fields to learn anything more than the basics in are combinatorics and number theory. They're both hard because everything's discrete, which means that you often need to find a way to make what you're studying continuous in order to use most of the tools from other branches.

Analysis can also be quite difficult, because it requires a bit more creativity than the other branches, I think. But some people find it very easy.

My number theory class had a running joke: the easier a problem is to state, the harder it is to solve.

Try getting through even the first few pages of Wiles' theorem (i.e. Fermat's Last Theorem) and you'll see what I mean. Apparently, the Birch and Swinnerton-Dyer Conjecture (http://www.claymath.org/millennium/) is related, but is even harder than that. Note that of the seven problems on that page, there are two, Birch and Riemann, related to zeta functions! Two more, the Hodge and Yang-Mills problems, are difficult in large part because of their discrete nature, and the essence of the Poincaré problem is a discontinuity, or a discrete part (discretion?) of the object under consideration. Even the P vs NP problem will probably require advances in counting methods and proofs to solve.

A very hard (but solved) combinatorics problem, as an example:

http://en.wikipedia.org/wiki/Four_color_theorem

Using mathematical rules and procedures based on properties of reducible configurations (e.g. the method of discharging, rings, Kempe chains, etc.), Appel and Haken found an unavoidable set of reducible configurations, thus proving that a minimal counterexample to the four-color conjecture could not exist. Their proof reduced the infinitude of possible maps to 1,936 reducible configurations (later reduced to 1,476) which had to be checked one by one by computer. This reducibility part of the work was independently double checked with different programs and computers. However, the unavoidability part of the proof was over 500 pages of hand written counter-counter-examples, much of which was Haken's teenage son Lippold verifying graph colorings.

Ouch!

01101001

2006-Aug-11, 05:52 AM

Judging from what my math Ph.D. friends have trouble with, the toughest fields to learn anything more than the basics in are combinatorics and number theory.

Interesting. I did Computer Science, with a math minor, specializing in discrete math -- like computers do -- and combinatorics and number theory were joys. It all seemed so natural and logical. I never did like that slippery continuous stuff.

The most vivid memory I have of difficult math, for some, was a grad-level logic course, jointly offered by the math and philosophy departments. The first day there were several liberal arts folks in attendance, perhaps English majors, who were interested, I suppose, in rigorous logic. I never saw them after the first day.

Name drop: 4-color Haken's kid slept on my couch for a summer.

PhantomWolf

2006-Aug-11, 07:32 AM

Well, the largest set of numbers is the complex numbers (and you always thought you couldn't find the square root of -1), but otherwise.......

Saluki

2006-Aug-11, 02:14 PM

This question is like asking whether astrophysics is "higher" than biochemistry. At the cutting edge, both are extremely complex subjects. Each of us has our own strengths and weaknesses, so different people would give different answers as to which is "higher". The correct answer is "neither". We are trying to compare apples to oranges.

I think the OP is used to the standard progression of math education that starts in primary school, where we are first taught basic arithmetic, then basic algebra, geometry, analytic geometry, trig, calc, and diff eq. Each of these courses does indeed tend to be "tougher" than the previous one, but all are just building blocks that lead to "real" mathematics as practiced by mathematicians.

Kaptain K

2006-Aug-11, 02:49 PM

Someone mentioned "branches" of mathematics. That does not mean that you can simply follow a branch to its "end" and there find the "highest" level of that branch. It's more like a web than a tree. Sometimes the solution to a tricky problem in one field may be found in another (supposedly) unrelated field.

montebianco

2006-Aug-11, 02:53 PM

By maths. :wall:

(Would that then be metamaths?? ;) )

Well, it would be sufficient to impose a total ordering, and the highest level of maths could be whatever one chooses it to be. . .

montebianco

2006-Aug-11, 02:55 PM

well the largest set of numbers are complex numbers (and you always thought you couldn't find the square root of -1) but otherwise.......

Well, one can easily construct larger sets of numbers. For example, many at this forum have a number which is infinitely close to one, but not equal. . .

JohnW

2006-Aug-11, 03:29 PM

Well, one can easily construct larger sets of numbers. For example, many at this forum have a number which is infinitely close to one, but not equal. . .

AAAAARGH! NO! NOT AGAIN!!!!!!!!

hhEb09'1

2006-Aug-11, 03:44 PM

Judging from what my math Ph.D. friends have trouble with, the toughest fields to learn anything more than the basics in are combinatorics and number theory. They're both hard because everything's discrete, which means that you often need to find a way to make what you're studying continuous in order to use most of the tools from other branches.

As some people have already mentioned, those two fields are considered easy by some--but then, the combinatorics and number theory that appear in undergraduate computer science classes are easy compared to what your PhD friends are pursuing. That's another reason why there is no answer to the OP: even though it lists algebra on a lower level than geometry, there are problems in algebra so hard that no one has ever solved them. Same goes for arithmetic.

well the largest set of numbers are complex numbers (and you always thought you couldn't find the square root of -1) but otherwise.......

The set of "ordinary" real numbers is just as large as the set of complex numbers. :)

01101001

2006-Aug-11, 04:18 PM

As some people have already mentioned, those two fields are considered easy by some--but then, the combinatorics and number theory that appears in undergraduate computer science classes is easy compared to what your PhD friends are pursuing.

I don't know about others, but I was speaking of graduate level math courses in combinatorics and number theory, shoulder-to-shoulder with Math PhD-candidates, some of whom struggled, and some of whom didn't. Different strokes.

agingjb

2006-Aug-11, 06:49 PM

I suppose Category Theory could be, at the moment, the most inclusive and general branch of mathematics. But even if it is, I suspect something else will overtake it.

Jeff Root

2006-Aug-11, 07:22 PM

Someone mentioned "branches" of mathematics. That does not mean that you can simply follow a branch to its "end" and there find the "highest" level of that branch. It's more like a web than a tree.

And it's a multidimensional web. It might even have as many dimensions as links!

-- Jeff, in Minneapolis

afterburner

2006-Aug-11, 10:27 PM

Ah, I see. It's more like a web than a tree, a multidimensional web. :D

Thanks for the answers!

Chuck

2006-Aug-12, 03:15 AM

AAAAARGH! NO! NOT AGAIN!!!!!!!!

That's the answer, though. The highest branch of mathematics is 1.0 > 0.999999... theory.

cjl

2006-Aug-12, 03:19 AM

except that 1.0=0.999999... and it can be proven...
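For reference, the usual short argument (valid under the standard construction of the real numbers):

```latex
\begin{align*}
x   &= 0.999\ldots \\
10x &= 9.999\ldots \\
10x - x &= 9.999\ldots - 0.999\ldots = 9 \\
9x  &= 9 \quad\Rightarrow\quad x = 1
\end{align*}
```

More rigorously, 0.999... is by definition the limit of the partial sums 9/10 + 9/100 + 9/1000 + ..., a geometric series whose sum is exactly 1.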

Wolverine

2006-Aug-12, 03:24 AM

Oh no... :doh:

montebianco

2006-Aug-12, 05:21 AM

except that 1.0=0.999999... and it can be proven...

Well, it can be proven given the standard definition of the real numbers. I'm sure the people who argue that they are not equal have carefully pondered the construction of the real numbers, its advantages and disadvantages, and reject the construction in favor of some alternative number system that they construct rigorously from a self-consistent set of axioms, and are not just arguing for inequality because they don't know what they're talking about. . .

jlhredshift

2006-Aug-12, 02:04 PM

John D Barrow in his book "The Book of Nothing" has a chart called the "Structure of Modern mathematics". It is wider than it is tall, but across the top he has:

1) Lie Groups

2) Lie Algebras

3) Differential Operators

4) Manifolds with Tensor Fields

5) Metric Manifolds

ASEI

2006-Aug-12, 07:59 PM

I sometimes have trouble with very abstract areas of math. I like to have a mental picture of what I'm doing, and that gets hard when you do things at different/transformed levels. (Nyquist plots to solve PID control problems, for example.)

Something I'm good at is writing numerical solvers for various problems. I see an integral and think t=0; Int = 0; while(t<tend) {Int+=f(bla)*dt; t+=dt;}. Derivatives, gradients, laplacians, are similarly easy. I have a very literal idea of what is happening numerically. Searching solution spaces for maximums/minimums/intercepts is just a matter of applying an algorithm that evaluates points within the solution space.

I was wondering if mathematicians are good at generating these mental pictures, or if, at their level, it is all just abstract manipulation to them. How do you guys think of your problems?

grav

2006-Aug-12, 10:17 PM

I sometimes have trouble with very abstract areas of math. I like to have a mental picture of what I'm doing, and that gets hard when you do things at different/transformed levels. (Nyquist plots to solve PID control problems, for example.)

Something I'm good at is writing numerical solvers for various problems. I see an integral and think t=0; Int = 0; while(t<tend) {Int+=f(bla)*dt; t+=dt;}. Derivatives, gradients, laplacians, are similarly easy. I have a very literal idea of what is happening numerically. Searching solution spaces for maximums/minimums/intercepts is just a matter of applying an algorithm that evaluates points within the solution space.

I was wondering if mathematicians are good at generating these mental pictures, or if, at their level, it is all just abstract manipulation to them. How do you guys think of your problems?

I'm pure algebra and some geometry. If I come up with a problem I can't figure out an equation for, I just plug the numbers into U-Basic and let the computer do the work. I try different values for all of the variables, along the range from the lowest possibilities to the highest, and find a relationship from the results. I then test them with a few mid-range values. Used to be, I would sit for days and use up notebook after notebook trying to work out some of the most complex formulas. I can't even comprehend how I did that, considering I have no patience for being stuck, and when I came up with a wrong answer I would spend just as long finding where I erred or simply starting over. Thank goodness for U-Basic!

By the way, I have come up with a way to find the formula for any summation to any value of n (number of summations) that does not include geometrical progressions, or irrational or complex numbers, in the summation or as a limit. This probably doesn't leave much, but I have been looking for a challenge by which to test it. I can be given the progression for the summation itself or just a few of the results of the series to some value of n, but I have to know n as well (and every summation has to be performed in the same way). Any takers?

Tensor

2006-Aug-13, 12:32 AM

I was wondering if mathematicians are good at generating these mental pictures, or if, at their level, it is all just abstract manipulation to them. How do you guys think of your problems?

It depends on the problem. Some are just manipulation. Some are helped by mental pictures. But, then, most of the problems I work with are in the geometric world, so the mental pictures can help, for me anyway.

montebianco

2006-Aug-13, 01:59 AM

It depends on the problem. Some are just manipulation. Some are helped by mental pictures. But, then, most of the problems I work with are in the geometric world, so the mental pictures can help, for me anyway.

I work with partial differential equations, and when I have a nice clear picture forming, I know I am close to the solution. . .

Maksutov

2006-Aug-13, 11:34 AM

Come on people, you know what I mean...or do you? :eh:

As in...simple adding...arithmetics...algebra perhaps...geometry...calculus...more advanced geometry...matrices...(highest level)

I guess, whats the most advanced math? hmm...hope this helps.

All the above, and more...

Bad jcsd

2006-Aug-13, 01:54 PM

There isn't one as such, but category theory or something similar is probably the most generalized area of maths.

crosscountry

2006-Aug-14, 05:00 AM

how about this question:

What are the three most recently discovered maths? Calculus was hundreds of years ago.

jlhredshift

2006-Aug-14, 12:44 PM

how about this question:

what are the three most recent maths discovered. Calculus was hundreds of years ago.

My niece's checkbook because it is multi-variable, time shifted, missing data, and appears to call on metaphysics for resolution.

That's one.

hhEb09'1

2006-Aug-14, 02:02 PM

John D Barrow in his book "The Book of Nothing" has a chart called the "Structure of Modern mathematics". It is wider than it is tall, but across the top he has:

1) Lie Groups

2) Lie Algebras

3) Differential Operators

4) Manifolds with Tensor Fields

5) Metric Manifolds

What does he have across the bottom?

Argos

2006-Aug-14, 02:17 PM

There is a lowest math level, which is simple addition (or subtraction) of integers. We start studying maths with them. The highest levels of maths would correspond to increasingly higher levels of abstraction.

ASEI

2006-Aug-14, 03:11 PM

I was wondering:

Since one definition of multiplication (M*N) could be

K = 0; for (I = 0; I < N; I++) K = K + M;

and another definition for powering could be (M^N)

K = 1; for (I = 0; I < N; I++) K = K * M;

and so on for higher levels of this series which aren't often used

(M^^N): K = 1; for (I = 0; I < N; I++) K = M ^ K;

(M^^^N): K = 1; for (I = 0; I < N; I++) K = M ^^ K; (Really big numbers for any good value of N)

could you have some level of this series preceding addition?

(M+N):

K = 0; for (I = 0; I < N; I++) K = K ?? M;

?

Nowhere Man

2006-Aug-14, 04:16 PM

could you have some level of this series preceeding addition?

(M+N):

K = 0; for(I = 0; I < N; I++; ) K = K ?? M;

?

For addition, say this instead

(M+N):

K = M; for (I = 0; I < N; I++) K = K + 1;

or

K = 0; for (I = 0; I < N; I++) K = K + 1;

for (I = 0; I < M; I++) K = K + 1;

I don't know if your progressive series can be applied in this case.

Fred

hhEb09'1

2006-Aug-14, 04:45 PM

could you have some level of this series preceeding addition?

(M+N):

K = 0; for(I = 0; I < N; I++; ) K = K ?? M;

?

K = M; for (I = 0; I < N; I++) K++; :)

Cougar

2006-Aug-14, 05:06 PM

What are the three highest levels of math that humanity knows? As of 2006.

As noted, the "level" of math is not really quantifiable. Nevertheless, HERE (http://www.math.tamu.edu/teaching/graduate/courses.htm) is a listing of graduate level mathematics courses, which should give you some idea....

snarkophilus

2006-Aug-14, 07:06 PM

what are the three most recent maths discovered. Calculus was hundreds of years ago.

In terms of major branches? Numerical analysis has seen a resurgence since the advent of the computer (though perhaps Jacobi would argue that it's been around for much longer). Fractal theory/chaos theory is pretty recent, at most a hundred years or so old. Non-linear dynamics (which some argue is the same thing) -- heck, non-linear anything -- is pretty active now that we have computers, just because results were hard to come by before that.

Part of the problem here is that it's difficult to pin down exactly when a branch of math was discovered, because they all kind of merge. Sometimes you get a big jump in development, like with Newton and Leibniz in calculus, but the Greeks used some of the same concepts thousands of years earlier. It's just that the popular founders were the first to really sit down and delve deeply (it turns out that it doesn't always matter who publishes first :) ).

Computing theory is relatively recent... you could maybe attribute the start to the time of Babbage and Lovelace, but Gödel, Turing, et al. made it what it is today. Topology is from the last hundred and fifty years, as is set theory.

snarkophilus

2006-Aug-14, 07:31 PM

could you have some level of this series preceding addition?

(M+N):

K = 0; for (I = 0; I < N; I++) K = K ?? M;

Well, there's the inverse power tower function. It's pretty much the slowest-growing function of them all. http://en.wikipedia.org/wiki/Power_tower

By definition, all numbers are constructed (usually) from the positive integers. So, learning to count and order numbers is the most basic math there is, sort of. You can do a fair amount of math without even that. But in a sense, there's no point in asking which series precedes or follows the other: they're all the same (the mathy term is isomorphic). We had a big argument about this somewhere in the general science thread a while back.

Here's an interesting way of counting, by the way: 3, 5, 7, 9, ..., 2*3, 2*5, 2*7, 2*9, ..., 2^2*3, 2^2*5, ..., 16, 8, 4, 2, 1.
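The quoted loop builds each level of the series by iterating the level below it. A minimal Python sketch of that ladder (the recursive "hyperoperation" formulation; the function name is my choice, not from the thread):

```python
def hyper(n, a, b):
    """Hyperoperation level n applied to a and b.
    Level 0 is the successor, 1 is addition, 2 is multiplication,
    3 is exponentiation, 4 is tetration (the power tower), and so on."""
    if n == 0:
        return b + 1
    if b == 0:
        # Base cases: a + 0 = a, a * 0 = 0, a ^ 0 = 1 (and 1 for higher levels)
        return a if n == 1 else (0 if n == 2 else 1)
    # Level n with b copies = level n-1 applied to a and the result for b-1 copies
    return hyper(n - 1, a, hyper(n, a, b - 1))
```

So hyper(1, 3, 4) is 3+4 = 7, hyper(2, 3, 4) is 3*4 = 12, hyper(3, 2, 3) is 2^3 = 8, and hyper(4, 2, 3) is the power tower 2^(2^2) = 16. The level preceding addition is just counting up by one.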

Nereid

2006-Aug-15, 01:23 AM

And what about work done at the very "lowest" level (the logical foundations of math)? Most definitely not for the faint of heart.

Ditto, the branch of metamathematics that studies the consistency of the axioms of various parts of math (and which led to the entirely unexpected results of Cantor, on proof)? Is this a "low" level, or one of the most advanced "levels"?

If a result can be simply stated, simply understood, and simply proved, does that mean it is (automatically) a "low" level thing? Example: Russell's 'set of all sets' (http://www.cut-the-knot.org/selfreference/russell.shtml)

Chuck

2006-Aug-15, 02:16 AM

That's an easy one. The set of all sets that are not members of themselves does not exist. The assumption that it does leads to the contradiction that Russell found, so the assumption is false. There's no such set.
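A playful sketch of why the assumption self-destructs: model a "set" as a membership predicate, and define Russell's set R so that s is a member of R exactly when s is not a member of itself. Asking whether R is a member of itself then has no stable answer — in Python, the question literally never terminates:

```python
def R(s):
    """Russell's 'set', modeled as a predicate: s is in R iff s is not in s."""
    return not s(s)

# R(R) would have to equal (not R(R)); the evaluation chases itself forever.
try:
    R(R)
except RecursionError:
    print("R(R) has no consistent truth value")
```

The RecursionError is the machine's version of the contradiction: there is no value the membership question can settle on.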

Nereid

2006-Aug-15, 02:44 AM

That's an easy one. The set of all sets that are not members of themselves does not exist. The assumption that it does leads to the contradiction that Russell found so the assumption is false. There's no such set.Indeed - hence my question - in terms of the OP's 'levels', is this 'high', 'low', or 'not even on the map'?

It's only a century or so old, yet (so the story goes) a mathematician's life's work (on set theory) got trashed because of it (he'd assumed the existence of such a set, in the foundation part of his weighty tome).

Yet it's such a simple thing - easy to state, easy to prove (the contradiction), and easy to understand. QED (it's 'low level' maths).

Or is it like light - it exhibits 'low-high level duality'?

Or perhaps it's like itself - it can be used to prove that 'levels' cannot exist (without leading to contradictions)?

Chuck

2006-Aug-15, 02:58 AM

With set theory in such disarray it's awfully brave of us to try to classify things right now.

snarkophilus

2006-Aug-15, 07:10 AM

With set theory in such disarray it's awfully brave of us to try to classify things right now.

How is set theory in disarray? Most, if not all, of the axiomatic difficulties encountered near the start of the 20th century have long since been resolved by the creation of axiomatic set theories. Plus, we restrict set theory to systems which obey those axioms, and we have category theory to cover the stuff that isn't covered by set theory.

Regarding bravery:

"Be bold, be bold

But not too bold

Lest that your heart's blood should run cold."

Good words for anyone in any branch of science, I think. :)

Or perhaps it's like itself - it can be used to prove that 'levels' cannot exist (without leading to contradictions)?

Most of the posts in this thread have suggested something like this (minus the proof part -- an interesting idea). If we're going to discuss this rigorously, rather than through anecdote and conjecture, we'll need to define "level" in a quantifiable way.

Maybe then we can develop the branch of mathematics that analyzes the branches of mathematics. I have to wonder if this branch will be able to analyze itself.... :D

Argos

2006-Aug-15, 02:22 PM

Poincaré Conjecture (http://www.nytimes.com/2006/08/15/science/15math.html?8dpc) solved. This is so high level... :)

If levels do not exist why do we start into maths by adding 1+1? Highly abstract concepts are surely more difficult to learn, or grasp. So it's legitimate to say that mathematics has levels, yes.

Chuck

2006-Aug-16, 01:57 AM

It's not very satisfying to make up new axioms. You're making up a new set theory. The problems with the original set theory remain unsolved.

crosscountry

2006-Aug-16, 03:45 AM

Poincaré Conjecture (http://www.nytimes.com/2006/08/15/science/15math.html?8dpc) solved. This is so high level... :)

If levels do not exist why do we start into maths by adding 1+1? Highly abstract concepts are surely more difficult to learn, or grasp. So it's legitimate to say that mathematics has levels, yes.

just because you couldn't learn differential equations as a child doesn't mean others can't.:D

In this instance I think it comes down to definitions. How many definitions do you need to build on? That's the question of ordering: if one branch needs more definitions than another, then it's a higher level.

snarkophilus

2006-Aug-16, 07:52 AM

In this instance I think it comes down to definitions. How many definitions do you need to build on? That's the question of ordering: if one branch needs more definitions than another, then it's a higher level.

I disagree. Except for the fundamental axioms, definitions are arbitrary, and they usually exist to make things easier, not harder. For instance, if you're comparing the size of two objects, it's nice to define a unit of length. Then you can measure both objects, and do a simple comparison. If you don't create that definition, you have to lug those two objects around and compare directly. This is as true in math as it is in a physical sense: how could you compare the Golden Gate bridge and the Eiffel Tower without units of length? Not easily, I can assure you.

The hardest math has nothing to do with how abstract it is, or how much there is to learn, or any of that. Some people are better with abstract concepts than with actual numbers (I know I am). Some people are wizards at memorizing and connecting definitions.

The hardest math, in general, is the stuff on the frontiers, because you don't have the luxury of using well-established tools to solve your problems. Newton was a great man not just because he worked out gravity, but because he had to build a whole new set of tools (calculus) in order to solve his problems. The most advanced stuff is the stuff we don't know much about, the largely untested areas, in which existing results could still be wrong.

snarkophilus

2006-Aug-16, 08:04 AM

It's not very satisfying to make up new axioms. You're making up a new set theory. The problems with the original set theory remain unsolved.

It's not that they made up new axioms. It's that they based set theory on axioms, period. It is a scientific way of doing things, rather than just a bunch of hand-waving and saying, "oh yeah, that's a set," without proving that that's the case. Problems are only experienced when you're not sufficiently clear about what you are doing.

There's a reason why the old idea is called "naive set theory."

Look at it another way: naive set theory did the following. It said that a set was any collection of objects. Then it defined some operations, like union, membership, and intersection. Then it ran into trouble when someone proposed a set that contained itself.

The problem is in one of the assumptions. Maybe a set isn't any collection of objects, but some restricted set. If you assume the other assumptions to be true, then that must be the case (else you get a contradiction). Or maybe the trouble is in the operations, and you need to accept that set theory can't do as much as originally claimed (this path is what we now call category theory). But either way, naive set theory is wrong.

But that doesn't mean that the corrected systems are wrong. Knowing why the naive theory is wrong allows us to fix it. Hence, axiomatic set theory and category theory (and a few other ideas). Indeed, they are rigorously defined, proven correct (to the extent Gödel allows), and also pretty useful.

Let me put it another way: the only problems with set theory are those that occur when you don't use set theory, but some primitive, unproven formulation of it. It's like taking a car out of the factory before they install the brakes. You're just asking for trouble.

ASEI

2006-Aug-16, 10:18 PM

Can't deal with things that contain themselves? Interesting. Some fractals contain scaled-down versions of themselves (the Sierpinski gasket, for example).

Peter Wilson

2006-Aug-16, 11:37 PM

except that 1.0=0.999999... and it can be proven...

You mean when I pay $3.299... for a gallon of gas, I'm really paying $3.30?

PhantomWolf

2006-Aug-17, 12:45 AM

The set of "ordinary" real numbers is just as large as the set of complex numbers.

Hmmm, now here's a question. If I have an infinite set and you have an infinite set, but my set also includes all of yours, and also has members that yours doesn't, can it be classed as bigger?

edited to add: Essentially Real numbers are a subset of Imaginary Numbers, and so the set of Imaginary numbers must somehow be bigger, even though they are both infinite. A weird idea, but.......

Cougar

2006-Aug-17, 01:01 AM

The problem is in one of the assumptions. Maybe a set isn't any collection of objects, but some restricted set. If you assume the other assumptions to be true, then that must be the case (else you get a contradiction). Or maybe the trouble is in the operations, and you need to accept that set theory can't do as much as originally claimed (this path is what we now call category theory). But either way, naive set theory is wrong.

But that doesn't mean that the corrected systems are wrong. Knowing why the naive theory is wrong allows us to fix it. Hence, axiomatic set theory and category theory (and a few other ideas). Indeed, they are rigorously defined, proven correct (to the extent Gödel allows), and also pretty useful.

A Belgian friend I met in Japan who is also a working mathematician pointed me to the book Goodbye, Descartes by Keith Devlin, professor of mathematics, Dean of the School of Science at St. Mary's College, and senior researcher at Stanford University's Center for the Study of Language and Communication. One of those "few other ideas" is likely what Devlin has been working on, that is, Situation Theory. I expect this is one of those "higher levels of math" that is nevertheless considerably fundamental. A quote or two...

"In many respects, situation theory is an extension of classical logic [in the analysis of communication via language] that takes account of context."

"The real root of the [Liar] paradox was neither self reference nor truth, but an unacknowledged context... So when proper attention is paid to context, the Liar Paradox ceases to be a paradox. In saying "This assertion is false," the individual a is making a claim that refers (implicitly) to a particular context, c, and that leads to a contradiction. So the claim must be false. But the context for making the observation that the claim is false cannot be c, since if it were, then that too leads to a contradiction."

And here's one that might be a little controversial....

"The evidence continues to mount that the answers to the age-old questions concerning the nature of thought, communication, and action will not be found until we go beyond the boundaries imposed by the legacy of Plato, Aristotle, Descartes, and all the other great thinkers in that two-thousand-year intellectual tradition."

Bluescreen

2006-Aug-17, 01:08 AM

Math is a symbolic representation of what happens in our real world. The highest level of mathematics that we have attempts to use pre-existing mathematical models to explain how things can exist beyond our current experience. Have you ever heard of a zero-dimensional object? They exist, but they cannot be measured by the same scalars that we use as standards. So we represent them with a new symbol indicating that "this object must be measured differently". This is the highest form of mathematical expression, and it is not relegated to one branch of science. Wherever there is an unknowable but definite quantity, quality, or action, how we measure it is math at its best.

Cougar

2006-Aug-17, 02:27 AM

Math is a symbolic representation of what happens in our real world.

Actually, I think it goes beyond that. As Murray Gell-Mann puts it...

"Another way to look at mathematics is to regard applied mathematics as concerning itself with the structures that occur in scientific theory, while pure mathematics covers not only those structures but also all the ones that might have occurred (or might yet turn out to occur) in science."

All conceivable mathematical structures lie within its province, while the ones that are useful in describing natural phenomena are only a tiny subset of those that are or may be studied by mathematicians.

ASEI

2006-Aug-17, 02:41 AM

Math is the art of being consistent. The great thing about being completely consistent is that you don't find any contradictions later.

snarkophilus

2006-Aug-17, 07:21 AM

Hmmm, now here's a question. If I have an infinite set and you have an infinite set, but my set also includes all of yours, and also has members that yours doesn't, can it be classed as bigger?

Yes, sometimes. Cantor proved this. Basically, each set has something called cardinality, and two sets with the same cardinality can be mapped to each other, element for element. For instance, there is no way to take the integers and reals and pair them up, element by element. There is, however, a way to pair up the integers and rationals (fractions). Therefore, integers and rationals are the same size, but the set of reals is bigger than both.

When dealing with infinite numbers, things can get confusing. For example, there are exactly the same number of even integers as there are integers. If the sets were finite, you'd get twice as many integers as even ones, but because they are infinite, there are exactly as many.

As another example, suppose you take the set containing all the integers. I take all the elements of your set and put them in mine, and then sprinkle a couple of fractions in there, and maybe a car or a boat. My set is still the same size as yours.

There's a neat and related thing called the continuum hypothesis (http://en.wikipedia.org/wiki/Continuum_hypothesis). I think that it would be very interesting to do math assuming that it isn't true, but I haven't seen much in that vein. Most people like it a lot.

There's also something called the Lebesgue measure that can be used to measure the "size" of infinite sets. Some sets, however, are non-measurable, and you get cool things like the Banach-Tarski paradox (http://en.wikipedia.org/wiki/Banach%E2%80%93Tarski_paradox).
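The integer–rational pairing mentioned above can even be written down explicitly. One elegant construction (the Calkin–Wilf sequence — my choice of example, not one named in the thread) lists every positive rational exactly once, which is precisely the element-for-element matching with 1, 2, 3, ... that cardinality requires:

```python
from fractions import Fraction

def calkin_wilf(n):
    """First n terms of the Calkin-Wilf sequence: a pairing of
    1, 2, ..., n with distinct positive rationals (every positive
    rational shows up exactly once as n grows)."""
    q = Fraction(1)
    terms = []
    for _ in range(n):
        terms.append(q)
        # Each term determines the next: q -> 1 / (2*floor(q) - q + 1)
        q = 1 / (2 * (q.numerator // q.denominator) - q + 1)
    return terms
```

The first few terms are 1, 1/2, 2, 1/3, 3/2, 2/3, 3, ... — so "the 5th rational" is a perfectly well-defined thing, which is exactly what it means for the rationals to be countable.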

Jeff Root

2006-Aug-17, 08:02 AM

Look at it another way: naive set theory did the following. It said that a set was any collection of objects. Then it defined some operations, like union, membership, and intersection. Then it ran into trouble when someone proposed a set that contained itself.

The problem is in one of the assumptions. Maybe a set isn't any collection of objects, but some restricted set.

Smoke is coming out of my ears! Somebody, quick! Call Douglas Hofstadter!!!

-- Jeff, in Minneapolis

montebianco

2006-Aug-17, 09:47 AM

The set of "ordinary" real numbers is just as large as the set of complex numbers.

Hmmm, now here's a question. If I have an infinite set and you have an infinite set, but my set also includes all of yours, and also has members that yours doesn't, can it be classed as bigger?

snarkophilus has a good answer, I just want to add that it depends on how you define "bigger." If you use a strict superset/subset relation, e.g., every integer is also a rational, but some rationals are not integers, then by this relation you could claim that the set of rationals is "bigger" than the set of integers. But the answer given by snarkophilus (and some other earlier comments) uses a definition by which two sets are considered to be of equal size if there is a one-to-one mapping between them.

A perhaps counter-intuitive result is that an infinite set can often be mapped one-to-one to a proper subset of itself. An even simpler example than real vs. complex or integers vs. rationals is the set of integers versus the set of even integers. Clearly the set of even integers is a proper subset of the set of integers, so by the definition towards which you seem inclined, the set of integers would indeed be "bigger." But there exists a one-to-one mapping: map each integer to its double. For each integer there is a corresponding even integer, and for each even integer there is a corresponding integer. So using cardinality to determine which set is "bigger," we would conclude that these two sets are of the same size, even though one clearly includes everything the other includes and more.
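The doubling map is easy to check mechanically. A small sketch of the pairing and its inverse:

```python
def to_even(n):
    """Pair each integer with its double."""
    return 2 * n

def from_even(m):
    """Invert the pairing: each even integer comes from exactly one integer."""
    assert m % 2 == 0
    return m // 2

# The round trip is exact in both directions, so the pairing is one-to-one
# and onto: no integer is skipped and no even integer is hit twice.
assert all(from_even(to_even(n)) == n for n in range(-100, 101))
assert all(to_even(from_even(m)) == m for m in range(-100, 101, 2))
```

That both directions round-trip on any sample you like is the finite shadow of the cardinality argument: the sets match up element for element.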

Argos

2006-Aug-17, 11:55 AM

edited to add: Essentially Real numbers are a subset of Imaginary Numbers, and so the set of Imaginary numbers must somehow be bigger, even though they are both infinite. A weird idea, but.......

I'd say that both real and imaginary numbers are subsets of complex numbers. Indeed, a real number can be regarded as a complex number with an imaginary part equal to zero (r + 0i).
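Python's own number types illustrate this embedding: a real number compares equal to the complex number with zero imaginary part, and arithmetic agrees on the embedded copies. A small sketch:

```python
r = 2.5
z = complex(r, 0.0)  # the "complex copy" of a real number: r + 0i

assert z == r                      # the embedded real is the same number
assert z.real == r and z.imag == 0.0
assert complex(1, 0) + complex(2, 0) == 3   # arithmetic agrees on embedded reals
```

This is exactly the sense in which the reals sit inside the complex numbers as a subset.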

ASEI

2006-Aug-17, 02:12 PM

In one set you have two degrees of freedom, in the other you only have one. Shouldn't that count for one set being "larger" in the same sense that a plane contains infinite lines (even though they both contain infinite points)?

jlhredshift

2006-Aug-17, 02:50 PM

In my humble opinion, I would suggest that the highest level of math would be one with both explanatory and predictive power for how the entire universe works. Of course this is cliche, and it is the goal of the GUTs, and we do not know how simple or complex it might be. Still, if we could ever achieve this goal I think it would qualify. At this point in our search we are not even positive about how many dimensions we exist in, or whether we could be inside some other "space" that we are not aware of.

I agree that it has to be "turtles all the way down". But, which way is "down"?

I agree that it has to be "turtles all the way down". But, which way is "down"?

Chuck

2006-Aug-17, 02:56 PM

Numbers are represented by individual points, so that's all that needs to be compared. Two infinite sets are considered to be the same size if their members can be put into one-to-one correspondence with each other. This can be done with the reals and the complex numbers.

"This statement is false" is not really a paradox. Some statements have no truth value. "This statement is true" is another such statement. This should not be confused with statements of unknown truth value. "I will win the next PowerBall lottery" is either true or false but we don't know which. The truth or lack thereof of "This statement is false" is not unknown. It is neither true nor false.

ASEI

2006-Aug-17, 03:52 PM

Two infinite sets are considered to be the same size if their members can be put into one to one correspondence with each other. This can be done with the reals and the complex numbers.

This is what I'm having difficulty seeing though. You can't put points on a line into one-to-one correspondence with all points on a plane, can you?

montebianco

2006-Aug-17, 07:32 PM

This is what I'm having difficulty seeing though. You can't put points on a line into one-to-one correspondence with all points on a plane, can you?

Here (http://www.physicsforums.com/showthread.php?t=54189) is someone's attempt to do just that, with some comments on some issues that arise. Of course, since a lot of people at this forum believe 4.999~ and 5 are different numbers, they shouldn't have any problem with the proposed method.
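The linked construction pairs a point of the plane with a single real by interleaving decimal digits. Here is a toy sketch of that idea (mine, not from the linked thread), done on finite digit strings of equal length so the 4.999~ = 5 subtlety never arises; `interleave` and `deinterleave` are invented names:

```python
# Toy version of the digit-interleaving map between a plane point and a
# single number, on finite decimal digit strings of equal length.

def interleave(x_digits, y_digits):
    """Merge two digit strings into one by alternating their digits."""
    return "".join(a + b for a, b in zip(x_digits, y_digits))

def deinterleave(z_digits):
    """Recover the two coordinate digit strings from the merged one."""
    return z_digits[0::2], z_digits[1::2]

# The point (0.12, 0.34) in the plane becomes the single number 0.1324,
# and nothing is lost: the original coordinates can be read back off.
x, y = "12", "34"
z = interleave(x, y)
assert z == "1324"
assert deinterleave(z) == ("12", "34")
```

The full proof has to handle infinite decimal expansions (and the repeating-9 representations), which is where the issues discussed at the link come in.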

ASEI

2006-Aug-17, 08:33 PM

That's not right. He's just mapping the numbers a+ai to the set of real numbers, not the entire plane.

Ah - nevermind. I think I see what he's trying to do. I still don't like trying to put a two dimensional object in only one dimension though.

snarkophilus

2006-Aug-17, 09:55 PM

This can be done with the reals and the complex numbers.

Yup... in fact, it can be done with the reals and any finite dimensional real space. That means that there must be a line of infinite length which, when drawn inside a square, fills the square completely. It also means that a similar line exists for a cube, and for a hypercube, and so on.

I still don't like trying to put a two dimensional object in only one dimension though.

You don't have to like it. You just have to believe it. :D

There is a suggestion that all of the information in a black hole is stored at its surface. That would be a reduction from a three dimensional space to a two-dimensional space, in the real world!

ASEI

2006-Aug-17, 10:26 PM

I thought that was where the whole concept of fractal dimension came into play. You can have a fractal line that sort of fills a plane. As the fractal dimension proceeds from 1 to 2, the crazy-curve-shape fills the plane with greater and greater density. Here's the thing though - you need more than one finite number to describe your position along a crazy-curve-shape of dimension > 1, because the curve is infinitely convoluted. Just one real number won't do, unless you plan on returning infinity for any deviation whose linear distance from the base-point is finite.
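The idea of a dimension strictly between 1 and 2 can be made concrete with the similarity dimension, log(copies)/log(scale), for a self-similar curve. A small sketch (my own illustration, with invented names): the Koch curve, built from 4 copies each scaled by 1/3, comes out near 1.26, while a generator of 4 copies at scale 1/2 comes out at exactly 2, matching the intuition of a curve that fills the plane.

```python
import math

# Illustrative sketch: similarity dimension of a self-similar curve made
# of `copies` pieces, each shrunk by a factor of `scale`.

def similarity_dimension(copies, scale):
    """Similarity dimension = log(copies) / log(scale)."""
    return math.log(copies) / math.log(scale)

# Koch curve: 4 copies at scale 1/3 -> dimension strictly between 1 and 2.
koch = similarity_dimension(4, 3)
assert 1.26 < koch < 1.27

# Plane-filling generator: 4 copies at scale 1/2 -> dimension exactly 2.
filling = similarity_dimension(4, 2)
assert abs(filling - 2.0) < 1e-12
```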

hhEb09'1

2006-Aug-17, 11:28 PM

The set of "ordinary" real numbers are just as large as the set of complex numbers.

Hmmm, now here's a question. If I have an infinite set and you have an infinite set, but my set also includes all of yours, and also has members that yours doesn't, can it be classed as bigger?

edited to add: Essentially Real numbers are a subset of Imaginary Numbers, and so the set of Imaginary numbers must somehow be bigger, even though they are both infinite. A weird idea, but.......

Some others have answered your question about my statement, but I thought another example might be helpful.

The set {1,2} is smaller than {1,2,3}, because it is contained in it, right? But what about {1,2} and {2,3}? Neither is contained in the other, but they both have the same number of elements. Similarly, it is obvious that the odd positive integers have the same number of elements as the even positive integers, right? That's because for every odd number, there is an even number (the number just 1 more than it).

But, as montebianco mentions, we can say the same thing for the integers and the even integers. Although the even integers are completely contained in the integers, there is still one unique even integer for every single integer, and vice versa. That's why we say the sets are the same size: the size of the integers is the same as the size of the even integers.

ASEI

2006-Aug-18, 12:28 AM

That's why we say the sets are the same size: the size of the integers is the same as the size of the even integers.

How so? There are two integers for every even integer. If you move along the real number line counting integers, and even integers, the number of hits in one is twice the number of hits in the other. Given x and y arbitrarily far apart on the real number line, the number of even integers contained is half the number of all integers. Or how about this: the set of all integers with the set of even integers excluded still has members. Because the set of all integers contains all even integers, these members are over and above what is found in the even integer set. How can this not be the case?

For that matter, the set of all real numbers is contained in the set of imaginary numbers. The fact that imaginary numbers contain additional elements not occurring in the set of real numbers, as well as the set of all real numbers, means the set is larger. How you pose things like this, and not encounter gobbledegook down the road is beyond me.

montebianco

2006-Aug-18, 12:42 AM

How so? There are two integers for every even integer.

That's true - you can map each even integer to two integers, for example, you can associate 2*N (an even integer) with 2*N-1 and 2*N (both integers). For each even integer, there are two integers, and for each two integers, there is one even integer.

However, as I indicated before, it is also possible to associate the two sets in a one-to-one manner. Associate 2*N (an even integer) with N (an integer). For each even integer, there is a corresponding integer, and for each integer, there is a corresponding even integer.

So there are two maps. In one, each even integer is associated with two integers, and in the other, each even integer is associated with one integer. This is a property of infinite sets - you can match them up in different ways.

We say two sets have the same cardinality if there exists a one-to-one mapping between them. This does not preclude the existence of other maps which are not one-to-one.

If you move along the real number line counting integers, and even integers, the number of hits in one is twice the number of hits in the other.

True enough, if you line the two sets up in this particular way. If you line them up differently, you get a different answer.

For that matter, the set of all real numbers is contained in the set of imaginary numbers. The fact that imaginary numbers

This is not the case, but would be if you replace "imaginary" by "complex."

contain additional elements not occuring in the set of real numbers, as well as the set of all real numbers, means the set is larger.

See my earlier post. If you choose to define "larger" by a strict superset/subset relation, then it is larger. If you use the cardinality definition, then it isn't.

How you pose things like this, and not encounter gobbledegook down the road is beyond me.

The way you avoid gobbledegook is by defining the terms you use precisely, including what it means for one set to be "larger" than another. If two people use two different definitions of what it means to be "larger," then it would not be surprising if they were to come to different conclusions about the size of two particular given sets.

crosscountry

2006-Aug-18, 02:25 AM

I disagree. Except for the fundamental axioms, definitions are arbitrary, and they usually exist to make things easier, not harder. For instance, if you're comparing the size of two objects, it's nice to define a unit of length. Then you can measure both objects, and do a simple comparison. If you don't create that definition, you have to lug those two objects around and compare directly. This is as true in math as it is in a physical sense: how could you compare the Golden Gate bridge and the Eiffel Tower without units of length? Not easily, I can assure you.

The hardest math has nothing to do with how abstract it is, or how much there is to learn, or any of that. Some people are better with abstract concepts than with actual numbers (I know I am). Some people are wizards at memorizing and connecting definitions.

The hardest math, in general, is the stuff on the frontiers, because you don't have the luxury of using well-established tools to solve your problems. Newton was a great man not just because he worked out gravity, but because he had to build a whole new set of tools (calculus) in order to solve his problems. The most advanced stuff is the stuff we don't know much about, the largely untested areas, in which existing results could still be wrong.

hmmmm.... Well, the question isn't hardest math but highest level. Even the most cutting edge math still uses definitions. Sure it may be harder because of its relative newness, but that doesn't make it higher level.

Your last sentence I disagree with completely.

snarkophilus

2006-Aug-18, 05:51 AM

How so? There are two integers for every even integer.

Your problem is that you are not thinking in terms of infinity. What you say is true only if you consider the numbers on some finite interval. Let's say we're thinking of the integers from 0 to n. In that case, there are (about) twice as many integers as even integers.

But that logic doesn't work when you deal with infinity. For every integer from 0 to n, there is an even number in the set of all integers. It doesn't matter how big you set n to be, there is always a 2n in the integers. Another way of looking at it is to say that 2 x infinity = infinity.

And the vagaries of infinity mean that you can go further, too. You can show that for each integer, there are two even integers (4n and 4n+2, for instance). Or three, or four, or any finite number.

So what does that mean? Your idea means that there are at least as many integers as even integers. But the mapping that assigns two even integers to each integer means that there are at least as many even integers as integers. The only logical conclusion is that there are just as many integers as even integers.
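The one-to-two pairing mentioned above (each integer matched with the two even integers 4n and 4n+2) is also easy to check on a finite window. A small sketch (mine, with the invented name `two_evens`):

```python
# Illustrative sketch: pair each integer n with the two even integers
# 4n and 4n+2, as described in the post above.

def two_evens(n):
    """Assign each integer two distinct even integers."""
    return (4 * n, 4 * n + 2)

window = range(-3, 4)
assigned = [e for n in window for e in two_evens(n)]

# All assigned values are even, no even integer is used twice, and
# together they cover a contiguous block of even integers.
assert all(e % 2 == 0 for e in assigned)
assert len(set(assigned)) == len(assigned)
assert sorted(assigned) == list(range(-12, 16, 2))
```

So the same two sets admit both a one-to-one map and a one-to-two map, which is the property of infinite sets being discussed here.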

snarkophilus

2006-Aug-18, 05:55 AM

hmmmm.... Well, the question isn't hardest math but highest level. Even the most cutting edge math still uses definitions. Sure it may be harder because of its relative newness, but that doesn't make it higher level.

Your last sentence I disagree with completely.

You are right. I just assumed that high level and hard were equivalent, which is not necessarily the case. My point was simply to say that the branch with the most definitions is not necessarily the most advanced, which was claimed previously.

I've been thinking about it, and I'm not certain I entirely agree with that statement any more... I do in one sense, but don't in another. For reference:

The most advanced stuff is the stuff we don't know much about, the largely untested areas, in which existing results could still be wrong.

But I'm curious as to why you disagree.

Ken G

2006-Aug-18, 09:03 AM

"This statement is false" is not really a paradox. Some statements have no truth value. "This statement is true" is another such statement. This should not be confused with statements of unknown truth value.

If you think about it, the distinction between unknown and unknowable truth value is still not enough to escape the paradox, however, as we can consider either statement: "this statement is not known to be true" or "this statement cannot be known to be true" and achieve the same paradox. The real problem is deeper than having statements that are undecidable-- self-referential statements are simply not allowable in a formal logical system. The paradox comes from the fact that our ability to use logic actually extends outside the confines of formal logic, and we usually get away with that for some pretty deep reason that I cannot fathom, but occasionally we get bitten and are surprised because we are used to not having to follow strictly the rules of self-contained logical systems. Why such formal logical systems have anything to do with reality in the first place is a related, and equally unfathomable, issue.

Ken G

2006-Aug-18, 09:15 AM

But I'm curious as to why you disagree.

I'd have to guess on crosscountry's behalf, but on my own behalf, I would have trouble seeing a mathematical "result" as being possibly "wrong". If a suggested proof is uncertain, then I would tend to not characterize it as a "result" but rather a "conjecture". But I grant you, there is the sticky issue of when is a proof really a proof, and I have no desire to get into that question!

montebianco

2006-Aug-18, 09:23 AM

You can show that for each integer, there are two even integers (4n and 4n+2, for instance).

Heh heh, you beat me to it :D

I suspect what gets people into trouble is that (as you suggest) intuition from finite sets doesn't work on infinite sets. If two finite sets can be put into one-to-one correspondence, then they can't be put into two-to-one or one-to-two correspondence (unless each is empty :D ). Not so with infinite sets. . .

snarkophilus

2006-Aug-18, 09:50 AM

I'd have to guess on crosscountry's behalf, but on my own behalf, I would have trouble seeing a mathematical "result" as being possibly "wrong". If a suggested proof is uncertain, then I would tend to not characterize it as a "result" but rather a "conjecture". But I grant you, there is the sticky issue of when is a proof really a proof, and I have no desire to get into that question!

Ah, but there have been many cases where a published result has stood for quite some time before it has been shown to be wrong. For instance, the four colour theorem was falsely thought to be proven twice in the nineteenth century, and each of those "proofs" stood for eleven years.

So part of the issue is that for complicated results, you're never really sure until some time has passed and a number of people have not just looked at the new thing, but have used it and become familiar with it. At least, I assume that mathematicians think that way. Physical scientists certainly do.

And if that result is integral to your result, well....

But it wasn't even of false proofs in particular that I was thinking when I wrote that. I was thinking more in terms of false paths. A while back in the General Science forum, someone posted a problem regarding particle flux. He was using Cartesian coordinates, and wanted a differential equation. The way he phrased the problem, it is almost impossible to solve. However, a simple switch to polar coordinates makes the problem almost trivial. So the result (and I realise it's a poor choice of word, but I think it's still applicable) upon which he is basing his current work is that of integration using Cartesian coordinates. It's not wrong per se, but it's wrong in the context of his problem.

And that's what makes the unexplored areas difficult. You don't know which tools work, and although someone might have something that works in some cases, there's no guarantee that it works for your particular problem. You could spend a substantial amount of time going down the wrong path, like when Wiles abandoned some of his early work in favour of new developments (ones he didn't create, and thus wasn't quite as intimately familiar with), one of which caused a fatal error in his proof.

snarkophilus

2006-Aug-18, 09:51 AM

If two finite sets can be put into one-to-one correspondence, then they can't be put into two-to-one or one-to-two correspondence (unless each is empty :D ).

The empty set! One of my favourite objects, mostly because I love to use the phrase "vacuously true."

Ken G

2006-Aug-18, 01:48 PM

So part of the issue is that for complicated results, you're never really sure until some time has passed and a number of people have not just looked at the new thing, but have used it and become familiar with it.

OK, I wondered if you were talking about "when is a proof a proof". I think that must be the most embarrassing element of mathematics, because the only mathematically correct answer to that is... never! So at the end we find mathematics is a human endeavor after all, and nothing is actually proven, but a level of usefulness does emerge, allowing for a close approximation of the concept of "proof" to be applied. But that's not really the way it is supposed to work, so I didn't know if you were going to go there. Still, the point is well taken.

So the result (and I realise it's a poor choice of word, but I think it's still applicable) upon which he is basing his current work is that of integration using Cartesian coordinates. It's not wrong per se, but it's wrong in the context of his problem.

Well, I wouldn't call that "wrong", only "unsuccessful". That only speaks to the rate of development, as opposed to real backward progress. But I agree that the key to new mathematical progress is always new tools, and finding those tools must be the exciting part of mathematics research. And your main point, that new tools can be mistakenly used and it may take some time to figure out what the mistake was, is certainly valid.

Bob

2006-Aug-18, 04:32 PM

When is a proof a proof?

A Russian mathematician named Perelman recently solved a problem in topology called the Poincaré conjecture, which was over a hundred years old. The proof is very sophisticated and is published in a 473 page book, so reviewing it for correctness is not trivial. An organization called the Clay Institute will award Dr Perelman its lucrative Millennium Prize after the proof has been public for two years without successful challenge.

Ken G

2006-Aug-18, 04:41 PM

That's amazing-- a proof that is assumed correct until shown to be otherwise, just because it is so long! Not a good development for mathematics, I should think. Hopefully someone will use it to inspire a much more elegant treatment. Still, mathematicians are just crazy enough to really work it all the way through and if there are no flaws found in two years, it's probably solid.

jlhredshift

2006-Aug-18, 04:46 PM

Remember what happened to Andrew Wiles!

tdvance

2006-Aug-18, 04:50 PM

When is a proof a proof?

A Russian mathematician named Perelman recently solved a problem in topology called the Poincaré conjecture, which was over a hundred years old. The proof is very sophisticated and is published in a 473 page book, so reviewing it for correctness is not trivial. An organization called the Clay Institute will award Dr Perelman its lucrative Millennium Prize after the proof has been public for two years without successful challenge.

If they can find him, and if he accepts it. See this (http://www.nytimes.com/2006/08/15/science/15math.html).

Todd

Bob

2006-Aug-18, 05:54 PM

That's amazing-- a proof that is assumed correct until shown to be otherwise, just because it is so long!

Actually this proof isn't too long. It has been peer reviewed and is considered correct.

There are proofs which are too long, though. The proof that the most efficient way of stacking spheres is in a pyramidal pattern (the Kepler conjecture) runs to hundreds of pages and includes a complicated computer program. The proof of the four-color map problem includes the results of a computer program which are too long to be reviewed in a human lifetime.

Ken G

2006-Aug-18, 07:29 PM

It kinda makes me glad I'm not a mathematician...

snarkophilus

2006-Aug-18, 09:35 PM

When is a proof a proof?

A Russian mathematician named Perelman recently solved a problem in topology called the Poincaré conjecture, which was over a hundred years old. The proof is very sophisticated and is published in a 473-page book, so reviewing it for correctness is not trivial. An organization called the Clay Institute will award Dr. Perelman its lucrative Millennium Prize after the proof has been public for two years without successful challenge.

Rumour has it that Perelman is going to win the Fields medal this year....

Bob

2006-Aug-22, 06:34 PM

... if he accepts it.

Looks like he's turning it down.

http://www.cnn.com/2006/TECH/science/08/22/math.genius.ap/index.html

publiusr

2006-Aug-25, 07:18 PM

Tetration and the extended cardinals allow numbers to get really big really fast.
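(As a quick illustration, not part of the original post: tetration iterates exponentiation the same way exponentiation iterates multiplication, so values explode after only a few steps. A minimal sketch:)

```python
def tetrate(base, height):
    """Tetration: base ** base ** ... ** base, `height` copies, right-associative."""
    result = 1
    for _ in range(height):
        result = base ** result
    return result

# Even tiny inputs blow up quickly:
print(tetrate(2, 3))  # 2^(2^2) = 16
print(tetrate(2, 4))  # 2^(2^(2^2)) = 2^16 = 65536
# tetrate(2, 5) already has nearly 20,000 decimal digits.
```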

I love Rudy Rucker's book Infinity and the Mind.

It begs to be made into a documentary.

crosscountry

2006-Aug-25, 07:52 PM

You are right. I just assumed that high level and hard were equivalent, which is not necessarily the case. My point was simply to say that the branch with the most definitions is not necessarily the most advanced, which was claimed previously.

I've been thinking about it, and I'm not certain I entirely agree with that statement any more... I do in one sense, but don't in another. For reference:

The most advanced stuff is the stuff we don't know much about, the largely untested areas, in which existing results could still be wrong.

But I'm curious as to why you disagree.

Sorry to have been away so long.

I disagree because it seems to me you are injecting your opinion while trying to state a fact. You and I hold different truths about what "advanced" is. I guess the definition of advanced lends itself to your idea of the cutting edge. To me, advancement is things we've already solved and accomplished.

Don't know if that answers your question.:eh:

crosscountry

2006-Aug-25, 07:54 PM

It kinda makes me glad I'm not a mathematician...

Me too. In my grad physics classes, the professor will often say something like "this is where we leave it to the mathematicians; ask them for the proof." None of us ever do.

That usually happens when something interesting is going on and we don't want to get bogged down in 200 years of math just to exclude other solutions.

snarkophilus

2006-Aug-25, 10:15 PM

I disagree because it seems to me you are injecting your opinion while trying to state a fact. You and I hold different truths about what "advanced" is. I guess the definition of advanced lends itself to your idea of the cutting edge. To me, advancement is things we've already solved and accomplished.

Oh, I see! No, clearly this thread is about opinions only... there's no way to actually measure what might be harder (except perhaps in a statistical sense, which still doesn't tell us a lot). However, (I think that) what I said is probably true for the majority of people.

I should have posted a sign: FACT-FREE ZONE. :)

crosscountry

2006-Aug-26, 05:44 AM

I guess you're right 8D

KrustyDog

2014-Nov-25, 07:53 PM

Advanced calculus, harmonic analysis, and number theory.

Nowhere Man

2014-Nov-26, 10:45 PM

Thread necromancy alert, 8+ years.

Fred

Chuck

2014-Nov-28, 03:37 PM

0.999... = 1.0 theory.
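(A quick sketch of the identity behind the joke, not part of Chuck's post: the partial sums 0.9, 0.99, 0.999, ... fall short of 1 by exactly 10^-n, so the gap shrinks below any positive bound and the limit is exactly 1. Exact rational arithmetic shows the shortfall:)

```python
from fractions import Fraction

# Exact partial sums of 0.9 + 0.09 + 0.009 + ...; the shortfall from 1
# is exactly 10**-n, which vanishes as n grows.
for n in (1, 3, 6):
    s = sum(Fraction(9, 10 ** k) for k in range(1, n + 1))
    print(n, s, 1 - s)  # the gap is Fraction(1, 10**n)
```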

DonM435

2014-Nov-28, 04:34 PM

For me, geometry, algebra, trigonometry and statistics were all logical and obvious, but I hit a wall after that. Calculus and differential equations were more art than science, and required some faith to get anywhere. I suppose that everybody has his or her own point of progress blockage like that.

Solfe

2014-Nov-28, 05:24 PM

0.999... = 1.0 theory.

It's a trap! Run!
