Mathematical Logic
- Matt Gregory
- Posts: 1537
- Joined: Tue Jun 07, 2005 11:40 am
- Location: United States
Mathematical Logic
Around here we often say that 1+1=2 is a statement of pure logic, but I'm not so sure that it really is.
When I looked up "addition" in the dictionary, the relevant entry said "the process of adding". "Add" means "to calculate a total". "Total" means "a sum of amounts added or considered together". "Sum" means "a total" or "an arithmetical calculation". None of these are enough to tell us that 1+1=2.
No matter how you define "addition" it could never tell you that 1+1=2 short of listing out every possible combination of numbers that could be added together, so how can it be a statement of pure logic (i.e. a truth based solely on definitions)? There must be something else added to it (no pun intended) to know that 1+1 actually equals 2.
If you didn't know what number 1+1 added up to, and if no one had ever figured it out before (i.e. you didn't have a calculator or anything like that) then it seems to me like you would have to draw a number line to figure it out (assuming that you somehow knew how addition worked in spite of not knowing the most basic sum). Wouldn't this turn it into sort of a spatial problem and wouldn't that make it a problem that depends on empirical evidence or how does that work? It doesn't seem like pure logic to me. 1=1 is pure logic, but 1+1=2 seems to contain a little more than that.
It might be easier to raise this question with a more complicated mathematical expression like the square root of 2. We know the definition of "square root" and we know the definition of "2", but I think it's clear that the mathematical expression "the square root of 2" is something no one will ever know the answer to, since it's an irrational number.
If 1+1 needs only pure logic, then the square root of 2 must only need pure logic, too. But if it needs only pure logic, why can't we find an exact answer?
-
- Posts: 2766
- Joined: Mon Sep 17, 2001 8:43 am
- Location: Australia
- Contact:
Re: Mathematical Logic
1+1=2 only by definition, and not for any other reason. It doesn't equal 2 because of the laws of addition.
Matt Gregory wrote: 1=1 is pure logic, but 1+1=2 seems to contain a little more than that.

We know that 1=1, so it follows that something more than 1 wouldn't equal 1, so we call it 2.
Matt Gregory wrote: If 1+1 needs only pure logic, then the square root of 2 must only need pure logic, too. But if it needs only pure logic, why can't we find an exact answer?

So the question is really whether a × a = a² (i.e., the square of a), where a is the square root of a², is logical. This too is by definition. Since it is a definition, it must be logical.
Re: Mathematical Logic
Matt Gregory wrote: Around here we often say that 1+1=2 is a statement of pure logic, but I'm not so sure that it really is.

This is a question that Russell and Whitehead tackled in the Principia Mathematica, where they attempted to derive arithmetic from pure logic.
see:
http://en.wikipedia.org/wiki/Whitehead-Russell_axioms
... especially the quote from the book on that page.
I refer you to that, rather than attempting to answer, because the book is fucking huge. The derivation is not simple or obvious. (Or interesting, IMO.)
Matt Gregory wrote: If 1+1 needs only pure logic, then the square root of 2 must only need pure logic, too. But if it needs only pure logic, why can't we find an exact answer?

"Square root of two" is an exact answer. It just can't be written out exactly in decimal notation. However, you can represent it as a decimal to within +/- e, for any given e > 0.
Compare that to 1/3. You can't write that out in decimal notation precisely, because it is 0.3333333... with the threes going on forever. However, 1/3 is a precise answer. (For rational numbers, we put a line over the repeating digits, indicating that they go on forever, and that may be considered part of decimal notation.)
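The "+/- e" point above can be made concrete with a small sketch (the bisection approach and the function name are mine, for illustration): you can squeeze the square root of 2 into as narrow a decimal interval as you like, even though no finite decimal hits it exactly.

```python
def sqrt2_within(e):
    """Approximate the square root of 2 to within +/- e by bisection."""
    lo, hi = 1.0, 2.0                  # sqrt(2) lies between 1 and 2
    while hi - lo > e:
        mid = (lo + hi) / 2
        if mid * mid < 2:              # midpoint too small: move the lower bound up
            lo = mid
        else:                          # midpoint too big: move the upper bound down
            hi = mid
    return (lo + hi) / 2

x = sqrt2_within(1e-9)
print(x)        # roughly 1.414213562...
print(x * x)    # very close to 2, but never exactly 2
```

Any e > 0 works; the answer is exact as a symbol, only the decimal representation is approximate.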
-
- Posts: 509
- Joined: Fri Mar 07, 2003 6:22 pm
Matt Gregory wrote: No matter how you define "addition" it could never tell you that 1+1=2 short of listing out every possible combination of numbers that could be added together, so how can it be a statement of pure logic (i.e. a truth based solely on definitions)? There must be something else added to it (no pun intended) to know that 1+1 actually equals 2.

I think you need set theory in addition to logic to give an answer to that question.
But here's one way to go, though there are the Peano axioms noted earlier.
Define the natural numbers to be
0 = Ø (note that 0 is a set with zero elements)
1 = {0} (note that 1 is a set with one element)
2 = {0, 1} (note that 2 is a set with two elements)
and for n > 0, n + 1 = n ∪ {n}. So, for example,
1 + 1 = 1 ∪ {1} = {0} ∪ {1} = {0, 1} = 2.
There you have it, truth from definitions.
n + 1 is n ∪ {n}, that is, a set with n elements "plus" a one-element set containing n.
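This construction (the von Neumann ordinals) can be sketched directly in code; the names here are mine, not anything standard, and frozensets stand in for sets so they can contain each other.

```python
# 0 = Ø, and n + 1 = n ∪ {n}, exactly as defined above.
ZERO = frozenset()

def successor(n):
    """n + 1, defined as the union of n with the one-element set {n}."""
    return n | frozenset([n])

ONE = successor(ZERO)   # {0}, a set with one element
TWO = successor(ONE)    # {0, 1}, a set with two elements

# "1 + 1 = 2" then holds by definition alone:
print(successor(ONE) == TWO)   # True
print(len(TWO))                # 2 - the set named "2" really has two elements
```

Nothing empirical enters: 2 is simply the name given to the set produced by applying the successor definition to 1.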
--Brian
- David Quinn
- Posts: 5708
- Joined: Sun Sep 09, 2001 6:56 am
- Location: Australia
- Contact:
I'm not sure that set theory works as a proof, as it assumes the very numerical definitions it seeks to prove.
Kevin's approach goes deeper because it literally defines the numbers from scratch. It says that when a single object is grouped together with another single object, the resulting number of objects shall be defined as "2". The process is purely definitional from start to finish.
-
Defining addition
DavidQuinn000 wrote: Kevin's approach goes deeper because it literally defines the numbers from scratch. It says that when a single object is grouped together with another single object, the resulting number of objects shall be defined as "2". The process is purely definitional from start to finish.

That's actually what the set theory approach does, as well. It may not be as obvious, because of the notation.
- Matt Gregory
- Posts: 1537
- Joined: Tue Jun 07, 2005 11:40 am
- Location: United States
Ok. Thanks for the replies, I did read them all but I asked a different question than I really meant to ask. I don't know if I can explain it, but I'm trying to figure out the precise boundary between what is pure logic and what is not. Kind of tough because anything you could think of has both a logical element and an empirical element.

I do think I understand the problem better, though. Correct me if I'm wrong, but when you find a mathematical answer, the problem together with its (correct) answer is something that's true by definition, but the process of finding the correct answer is empirical because if we get stuck on some step of the problem, we have to think about it until a workable idea comes to us. That's not logical. It's logical to think about it, but it's not logical to have to sit and wait for an answer.

I mean, it's logical to have to sit and wait for an answer when you think of the causes that make the waiting necessary, but that has nothing to do with the logic of the answer because it makes no difference how long it takes to find an answer. An answer that takes 10 years to find is just as logical as an answer that takes 10 seconds.
Does anyone see what I'm trying to say??? :-)
Well, I guess it doesn't matter if anyone can figure out what I'm trying to say. I was thinking that the process of finding an answer might make mathematics non-logical, but that's not right.
-
- Posts: 3851
- Joined: Fri Jun 03, 2005 4:12 pm
- Location: Flippen-well AUSTRALIA
I think, therefore I think that I am thinking
Matt Gregory wrote: I was thinking that the process of finding an answer might make mathematics non-logical, but that's not right.

Maybe you were thinking of something along the lines of Penrose's idea that some thought is non-algorithmic?
Penrose at Amazon.com
(Sorry Dave, I just had to fix that url link - D.R.)
Algorithms
What Penrose proposes is that AI will never be able to match the complexity of the human mind - and he's right, for now. Our mind is only capable of holding one thought in consciousness at a time (thanks to time). Behind this one thought, a multitude of algorithms are firing - for the next thought. What we deem 'intuition' is a rapid computation (algorithm) that takes place where all other thoughts do - the 'unconscious' mind. A quantum computer is a step towards AI, but still a baby step. When we discover how to form neurons, and duplicate their energy processes, we will be close. Consciousness is energy. I believe that there are multiple energy levels (vibrations) present in the brain, producing what we call 'the mind'. Yeah, there are the alpha, beta, delta, and theta waves - but, moreover, there is also their regulation, the regulation of neurons (a separate domain), and the regulation of synapses (also a separate domain). The ability to regulate the regulations is what needs to be programmed - it is their interaction that creates consciousness, I believe.
Last edited by sevens on Fri Aug 26, 2005 12:13 pm, edited 2 times in total.
- Matt Gregory
- Posts: 1537
- Joined: Tue Jun 07, 2005 11:40 am
- Location: United States
Re: I think, therefore I think that I am thinking
DHodges wrote: http://www.amazon.com/exec/-etc

Just a little rant first: could we make a rule or something that says people have to use URL tags so links don't mess with the wrapping of the page? I think that link should have looked something like this: amazon.com
Matt Gregory wrote: I was thinking that the process of finding an answer might make mathematics non-logical, but that's not right.

DHodges wrote: Maybe you were thinking of something along the lines of Penrose's idea that some thought is non-algorithmic?

I'm not familiar with Penrose, but I do think the process that causes particular thoughts to appear in our mind is not something we can call logical. Yet there are aspects that are logical: if we're trying to solve a hard problem then it's logical to wait for the answer to come to us rather than immediately write down an illogical answer. If we're trying to come up with an idea for a painting, it's logical to wait for a good idea to come to us instead of proceeding with no idea or a bad idea that we might already have. It's also logical in the sense that our brains are subject to cause and effect, so it's unreasonable to expect our minds to produce answers instantaneously all the time. I guess I already covered that in the earlier post.
So what does that leave as the illogical element of it? I think the issue of relevance has to be the deciding factor. All of the statements I made above about the logic of waiting for an answer are not relevant to the actual answer. The logic of the process of finding an answer is irrelevant to the logic of the answer itself. So I think I'm trying to tackle the issue of relevance and how we can figure out for sure if a logical argument is relevant to the issue or not.
But anyway, if we wanted to try and find the reasons for the necessity of waiting, I think they would be impossible to pin down, so in that sense I think we could call that process "non-algorithmic". If we knew all the causes then it would be algorithmic since we could break it down into steps that could simulate the causes. At least, I think we could. I'm not prepared to prove that statement or anything.
- Matt Gregory
- Posts: 1537
- Joined: Tue Jun 07, 2005 11:40 am
- Location: United States
Re: Algorithms
sevens wrote: What Penrose proposes is that AI will never be able to match the complexity of the human mind

You mean it will never be able to match the process that occurs in the human brain? Because you said "complexity" but then started talking about the process of the brain, and that doesn't have much to do with it.
Complexeosities
Matt,
What do you mean, what do I mean? I'm always right! (DAMN IT!)
Why doesn't the complexity of the human brain have anything to do with the topic?
- Matt Gregory
- Posts: 1537
- Joined: Tue Jun 07, 2005 11:40 am
- Location: United States
Re: Complexeosities
No, I meant that the process that creates the AI doesn't have much to do with the amount of complexity it can simulate. The problems solvable by modern computer technology can be as complex as you want if you devote enough resources to them.
Misfirings
I was referring to the task of replicating consciousness itself.
<Dual Miscommunication:
Syntax Error>
End Loop
Input <'The Daily Show'>
:)
- Matt Gregory
- Posts: 1537
- Joined: Tue Jun 07, 2005 11:40 am
- Location: United States
Formal axiomatic systems
I thought it might make sense to stick this reply in this math thread. The original discussion is over here.
David Hodges wrote:
As I see it, there are two main schools of thought in mathematical philosophy:
One school is content to treat mathematics as a purely intellectual exercise, inventing axiomatic systems and proving theorems from them without any concern for the existence of some physical system to which it might correspond. (Abstract algebra arose this way.) This is generally called "pure math", but we could think of it as "academic math," like we talk about academic philosophy. You can create mathematical systems that don't necessarily have any relation to a real physical system (although, curiously, applications for these invented systems are sometimes found).
The other school might be called "applied math" (or, philosophically, it might be called Constructivism). The reason math works is because it was built on observations of consistent ways the world works, and you can abstract general rules from a large number of specific observations.
A caveman might observe that two rows of three apples, or three rows of two apples, always gives a total of six apples, and from there abstract that the mathematically important principle doesn't rely on the nature of apples. Multiplication has more to do with the nature of rows and columns, and doesn't care what is in those rows and columns.
If I am right that numbers, counting and addition were in use before any formal axiomatic system existed - and I'm fairly confident that they were - then applied math is historically prior to pure math. I would argue that it is also philosophically prior, although it is generally taught the other way around - the usual way to teach is to present the general principle, and then show how it is applied (do some problems). But math as historically developed and generally used is an abstraction from reality. That's why it works.

I honestly think all math is pure math. An axiomatic system was invented when they invented addition. The "formal" part of it is just a technicality.
As I see it, addition was defined when they found the sums for all pairs of single digit numbers, figured out how to carry the one over when adding multiple digits, figured out how to deal with the decimal point when adding real numbers, and figured out how to deal with negative numbers. So, although this description isn't quite complete, a complete description of addition can be arrived at without any formalization.
Say we wanted to extend addition into algebra and solve 2x + 3 + 2x + 1. That just amounts to two ordinary addition problems. Dealing with like terms has nothing to do with addition. That's a completely different rule that defines how addition is to be used. Addition hasn't been changed in any way by its use in a different situation, and I don't think any situation exists where the nature of addition is changed in any way.
Formalization is just a language that really didn't create anything in itself. It's just a way of defining things. Addition wasn't improved when they formalized it or anything. It was already fully defined. If there is any use in formalizing addition, it would have to be its use as a model for formalizing other things. Or to put it another way, as an inspiration for new ideas.
My point is that I don't think pure math was invented when they started formalizing things. I mean it was since that's how "pure math" is defined, but I think the distinction between the two maths is pretty much arbitrary. Addition could have been discovered in a way that was completely divorced from application. I think if reality had anything to do with math, the most it could do is inspire it. There's nothing in reality that says 2 rows of 3 apples each is 6 apples. 6 had to be defined before we could come to that conclusion.
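The earlier point about 2x + 3 + 2x + 1 amounting to two ordinary addition problems can be sketched in code. The representation here is my own toy (a dict mapping the power of x to its coefficient), just to show that "collecting like terms" is a rule about grouping, while the arithmetic inside each group is plain addition.

```python
# A linear expression as {power_of_x: coefficient}.
# Collecting like terms = ordinary addition of the coefficients in each group.

def add_terms(*terms):
    total = {}
    for power, coeff in terms:
        # the only arithmetic here is ordinary addition
        total[power] = total.get(power, 0) + coeff
    return total

# 2x + 3 + 2x + 1  ->  two ordinary sums: 2 + 2 and 3 + 1
result = add_terms((1, 2), (0, 3), (1, 2), (0, 1))
print(result)   # {1: 4, 0: 4}, i.e. 4x + 4
```

The grouping rule decides *which* numbers get added; addition itself is unchanged by the algebraic setting.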
- Matt Gregory
- Posts: 1537
- Joined: Tue Jun 07, 2005 11:40 am
- Location: United States
If the numbers are defined as "a transfinite sequence where each element is uniquely identified", then addition could be defined as "starting in the sequence indicated by the first number, traversing the number of elements indicated by the second number and using the resulting element". Something like that. The advantage of a definition like this is that it doesn't depend on a particular type of notation.
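That traversal definition can be sketched with a finite slice standing in for the unending sequence (the slice and names are illustrative only). Nothing in it depends on decimal notation - the labels could be any distinct symbols in a fixed order.

```python
# An ordered sequence of uniquely identified elements; decimal names are
# incidental, only the ordering matters.
numerals = ["0", "1", "2", "3", "4", "5", "6", "7", "8", "9"]

def add(a, b):
    """Start at the element for a, then traverse forward by as many
    elements as b indicates, and use the element you land on."""
    return numerals[numerals.index(a) + numerals.index(b)]

print(add("1", "1"))   # "2"
print(add("3", "2"))   # "5"
```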
- Diebert van Rhijn
- Posts: 6469
- Joined: Fri Jun 03, 2005 4:43 pm
Re: Formal axiomatic systems
Matt Gregory wrote: There's nothing in reality that says 2 rows of 3 apples each is 6 apples. 6 had to be defined before we could come to that conclusion.

Better, in my opinion, would be to say that a construct like '2x3=6' was arrived at before it became part of our counting habits. A mind first creates categories or sets and then the ideas of specific quantities arise from there. Seems unavoidable to me.
Nature itself can be observed creating lots of repetitive shapes and forms. Our mind uses numbers to deal with those, but the number in our mind has to correspond with the manifestations of nature, and this can be and has been tested by scientific methods, to avoid error. Good for our survival too.
In this view numbers are essentially a manifestation of how our duality, the manifold, works, not merely a random tool of our brain that happened to have effects when applied. The symbols used only reflect this, just like A=A reflects.
Matt Gregory wrote: Addition could have been discovered in a way that was completely divorced from application.

Maybe. But surely it was discovered or defined as a result of studying the reality we perceive? More likely it was part of an evolutionary process, part of the brain wiring to start adding mentally, and then to realize we are adding and subtracting.
- Matt Gregory
- Posts: 1537
- Joined: Tue Jun 07, 2005 11:40 am
- Location: United States
Matt: Addition could have been discovered in a way that was completely divorced from application.

Diebert: Maybe. But surely it was discovered or defined as a result of studying the reality we perceive? More likely it was part of an evolutionary process, part of the brain wiring to start adding mentally, and then to realize we are adding and subtracting.

Sorry, I was using sloppy language. I meant "could have" in a theoretical sense, not in a historical sense. Historically speaking, what you said is almost certainly how it happened.
I'm going to start over again.
I think the root of the problem lies in comparison. We use math to compare things. But again, doesn't it seem paradoxical that we can compare the imaginary numbers in our mind and compare two things in physical reality and the comparisons turn out to be equivalent?
I think the only way this could occur is if we are mapping our concepts onto reality rather than deriving the concepts from reality. I can't even conceive of how a concept can be derived directly from reality. Can anyone explain how this could occur?
I think that our minds have the natural ability to compare things and every concept we have is the outcome of a comparison. I can't see any other possibility.
The mind has to be connected to reality by its ability to compare things. I think the senses are a far weaker connection. The fact that they overpower us is no proof that they are our most powerful connection to reality. That's a fact about our weakness for the senses, not about the strength of the senses to connect us to reality.
Re: Formal axiomatic systems
Matt Gregory wrote: I honestly think all math is pure math. An axiomatic system was invented when they invented addition. The "formal" part of it is just a technicality.

No, you can invent a formal system, ground up, by choosing a bunch of axioms and seeing what you get. The connection to reality (if there is one) is the degree to which your axioms apply to the situation you are modelling.
Without getting all technical here, check out, for instance, finite group theory. I will give some examples if it will help.
Matt Gregory wrote: Addition hasn't been changed in any way by its use in a different situation, and I don't think any situation exists where the nature of addition is changed in any way.

The idea of addition actually gets expanded, in that the operation can be used outside of where it started (the counting numbers).
The expansion into rational numbers and negative numbers is pretty straightforward and non-controversial. But it becomes less clear (I think) that "addition" is "the same" operation when you start, say, adding numbers with an imaginary component, adding vectors or matrices, or 'adding' elements in a finite group.
In a finite group - say, a modulus group - addition is written as a plus sign with a circle around it, to express that the operation is analogous to addition. (Here I put the operator in quotes rather than in a circle.)
"Addition" is said to exist if there is an additive identity (usually denoted as zero) such that, for any N,
N '+' zero = N
Similarly, "multiplication" is said to exist if there is a multiplicative identity (usually called one) such that, for any N,
N '*' one = N
Is that really the essential nature of addition and multiplication?
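For concreteness, here is a small sketch of those identity laws in a modulus group; the choice of Z mod 5 and the function names are mine, just to show that '+' and '*' there really do have a zero and a one, even though the operations behave unlike ordinary arithmetic.

```python
# Addition and multiplication in Z mod 5.
def mod_add(a, b, m=5):
    return (a + b) % m

def mod_mul(a, b, m=5):
    return (a * b) % m

# The identity laws: N '+' zero = N, and N '*' one = N.
print(all(mod_add(n, 0) == n for n in range(5)))   # True
print(all(mod_mul(n, 1) == n for n in range(5)))   # True

# Yet the operation itself is not ordinary addition:
print(mod_add(3, 4))   # 2, since 3 + 4 = 7 and 7 mod 5 = 2
```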
My point is that I don't think pure math was invented when they started formalizing things. I mean it was since that's how "pure math" is defined, but I think the distinction between the two maths is pretty much arbitrary.
Yes, I agree. The distinction is arbitrary, a matter of where (and if) you choose to draw a line - as all distinctions are, in the end. A lot of what I am calling "applied" math is actually pretty abstract.
Matt Gregory wrote: Addition could have been discovered in a way that was completely divorced from application.

Addition is a natural extension of counting.
For formal counting, define each counting number after one as
two = successor of (one)
three = successor of (successor of (one) )
... and so on, and of course we come up with a convention for naming each successive number.
(Aside: does this define all the counting numbers? Can I define an infinite number of objects at once? Is this a problem? How can I even talk about an infinity, when I am still defining one, two, three, etc.? )
Three plus two means to take the successor of (successor of (three)). It's a shorthand notation. Similarly, multiplication is an extension of addition ( x*y = adding x things together y times).
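The successor definition above can be sketched without presupposing any arithmetic at all. Here I model a counting number as a string of tally marks (my own stand-in for the abstract objects), so that "plus" really is just repeated succession.

```python
def successor(n):
    """The next counting number: append one tally mark."""
    return n + "|"

one = "|"
two = successor(one)                  # two = successor of (one)
three = successor(successor(one))     # three = successor of (successor of (one))

def plus(x, y):
    """x plus y: take the successor of x, once for every mark in y."""
    for _ in y:
        x = successor(x)
    return x

# "Three plus two" = successor of (successor of (three)):
print(plus(three, two))                                   # five tally marks
print(plus(three, two) == successor(successor(three)))    # True
```

Multiplication could be layered on the same way, as repeated application of plus.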
Matt Gregory wrote: I think if reality had anything to do with math, the most it could do is inspire it. There's nothing in reality that says 2 rows of 3 apples each is 6 apples. 6 had to be defined before we could come to that conclusion.

A row of six apples can be (mentally or physically) re-arranged to be two rows of three apples, or three rows of two apples.
Six is defined through counting. So yes, that would be defined before the operations of adding and multiplying...
Or would it? We can go through and define any large (but finite) group of counting numbers. But we would expect addition and multiplication to apply to all counting numbers, not just the ones we have explicitly defined.
Counting to one
Matt Gregory wrote: I think the only way this could occur is if we are mapping our concepts onto reality rather than deriving the concepts from reality. I can't even conceive of how a concept can be derived directly from reality. Can anyone explain how this could occur? I think that our minds have the natural ability to compare things and every concept we have is the outcome of a comparison. I can't see any other possibility.

Let's keep it simple and talk about counting. Say I count my cats and declare, "I have one cat."
I must be comparing my sensory experience of a particular group of objects ("things that are mine") to some sort of category in my head: the things that I consider to be cats, or have enough catlike qualities that will qualify that object to be considered a cat.
If I have a dog, hopefully it will not get included in my category of cats.
So, yes, we have to be able to define a 'thing' - an object of sensory experience - and compare it to a concept. Hopefully the category will be un-fuzzy enough that we can agree on what objects are and are not cats (at least enough of the time that the term is useful).
The question is, does the category 'cat' exist before you have any experience of cats? I say no. You learn the general usage of the term: whether or not housecats, lions, dogs, stuffed cats, pictures of cats, or people who are pretending to be cats will count as cats. The category must be defined before we can use it, but there may well be things we haven't thought of.
If you show someone a picture of a cat and ask, "How many cats are there?", generally they will say one. Strictly speaking, there are none (there is only a picture, not a cat). So usage can be sloppy, or vary depending on the situation.
So, if you see a picture of a cat, do you count that as a cat? It depends on your experience - if pictures of cats are generally counted as cats in that situation, then you will count it as a cat.
The concept is built up from a large number of experiences that refine the concept. A small child might call a dog a cat, because it doesn't know any better. An adult is unlikely to make that mistake. We don't define a bunch of categories in our head and then go look and see if they actually exist; we confront various objects, have experiences, and make categories to deal with what we have experienced.
- Matt Gregory
- Posts: 1537
- Joined: Tue Jun 07, 2005 11:40 am
- Location: United States
Ok, I'm going to try and cut right to the chase:
A "cat" is defined by looking at physical characteristics, right? Fuzzy, warm, has whiskers, catches mice, etc. We take all these and mentally assemble them into what we know as a "cat". We know this matches physical reality of the cat because we can see that a single object possesses all of these characteristics.
My question is, what is the physical characteristic of "1"? If you have one cat, does that cat possess the characteristic of "1"? If you got another cat, so that you would look at them and call them "two cats", would the characteristic of the cat that was "1" now become "2" so as to physically change the cat?
Or say that numbers are not part of the cats, but part of the group that the cats are in (i.e. two cats would be a group that has the characteristic of "2"). We're still left with the characteristic of being in a group. What is the physical change that occurs in the cats between being in a group and not being in a group? If the group is different from the cats, then the group must be some sort of physical object that can exist apart from the cats, but that makes no sense either.
This is why I think numbers can't come from studying physical reality, because there is nowhere in physical reality that these characteristics can be found.
A "cat" is defined by looking at physical characteristics, right? Fuzzy, warm, has whiskers, catches mice, etc. We take all these and mentally assemble them into what we know as a "cat". We know this matches physical reality of the cat because we can see that a single object possesses all of these characteristics.
My question is, what is the physical characteristic of "1"? If you have one cat, does that cat possess the characteristic of "1"? If you got another cat, so that you would look at them and call them "two cats", would the characteristic of the cat that was "1" now become "2" so as to physically change the cat?
Or say that numbers are not part of the cats, but part of the group that the cats are in (i.e. two cats would be a group that has the characteristic of "2"). We're still left with the characteristic of being in a group. What is the physical change that occurs in the cats between being in a group and not being in a group? If the group is different from the cats, then the group must be some sort of physical object that can exist apart from the cats, but that makes no sense either.
This is why I think numbers can't come from studying physical reality, because there is nowhere in physical reality that these characteristics can be found.
I'm not sure I'm seeing what you are getting at....
Matt Gregory wrote: Or say that numbers are not part of the cats, but part of the group that the cats are in (i.e. two cats would be a group of 2).
Yes, number is a property of a group or set, not of an individual element in that set.
Matt Gregory wrote: We're still left with the characteristic of being in a group. What is the physical change that occurs in the cats between being in a group and not being in a group?

Putting things in a group or set is a mental activity, not a physical one. There need not be any physical change.
Matt Gregory wrote: If the group is different from the cats, then the group must be some sort of physical object that can exist apart from the cats, but that makes no sense either.

Yes, number is a property of the group. (This is true even of a group of one cat - or zero cats.) The grouping is a mental category, not a physical object.
Matt Gregory wrote: This is why I think numbers can't come from studying physical reality, because there is nowhere in physical reality that these characteristics can be found.

But surely you can have an experience of seeing a group of six cats, six apples, six dollars?
Doesn't the idea of "six" arise from these experiences? I mean at a very early age. Isn't that the idea behind the counting things they do on Sesame Street?
Or is there some other place for "six" to come from?