Discussion:
Scary questions asked by IT teachers on uk.education.schools-it
(too old to reply)
Nigel Dooler
2004-07-28 03:32:21 UTC
Permalink
I have just read two posts on uk.education.schools-it by IT teachers
asking some amazingly simple IT-related questions. It makes you wonder,
if they are asking such questions, whether they should really be teaching
such a subject. I find these people asking such basic questions quite
shocking.
If a floppy disk is formatted does this remove viruses?
Thanks
Liz
How do I change the keyboard to a US one to suit some software? It's
XP, btw.
unknown
2004-07-28 15:31:02 UTC
Permalink
Post by Nigel Dooler
I have just read two posts on uk.education.schools-it by IT teachers
asking some amazingly simple IT related questions. Makes you think if
they are asking such questions if they should really be teaching such
a subject. I find these people asking such basic questions quite
shocking.
It's actually quite normal.

IT in schools is a load of toss (e.g. hardly anything useful for the real
world).

Do more maths instead :-)

T.
Robert de Vincy
2004-07-28 15:55:11 UTC
Permalink
Post by unknown
Post by Nigel Dooler
I have just read two posts on uk.education.schools-it by IT teachers
asking some amazingly simple IT related questions. Makes you think if
they are asking such questions if they should really be teaching such
a subject. I find these people asking such basic questions quite
shocking.
It's actually quite normal.
IT in schools is a load of toss (e.g. hardly anything useful for the real
world).
Do more maths instead :-)
Yep. That's useful for the "real world".
--
BdeV
Adam
2004-07-28 17:55:05 UTC
Permalink
Post by Robert de Vincy
Post by unknown
Post by Nigel Dooler
I have just read two posts on uk.education.schools-it by IT teachers
asking some amazingly simple IT related questions. Makes you think if
they are asking such questions if they should really be teaching such
a subject. I find these people asking such basic questions quite
shocking.
It's actually quite normal.
IT in schools is a load of toss (e.g. hardly anything useful for the real
world).
Do more maths instead :-)
Yep. That's useful for the "real world".
Actually it is.

A familiarity with numbers is essential to modern life, and a mind taught to
think logically helps people learn how to use computers, since computers are
also logical.

adam
Alun Harford
2004-07-28 18:19:17 UTC
Permalink
Post by Adam
Post by Robert de Vincy
Post by unknown
Post by Nigel Dooler
I have just read two posts on uk.education.schools-it by IT teachers
asking some amazingly simple IT related questions. Makes you think if
they are asking such questions if they should really be teaching such
a subject. I find these people asking such basic questions quite
shocking.
It's actually quite normal.
IT in schools is a load of toss (e.g. hardly anything useful for the real
world).
Do more maths instead :-)
Yep. That's useful for the "real world".
Actually it is.
A familiarity with numbers is essential to modern life, and a mind taught to
think logically helps people learn how to use computers, since computers are
also logical.
Not when they won't f*****g compile your code they're not :-) :-)

Alun Harford
Robert de Vincy
2004-07-28 18:31:54 UTC
Permalink
Post by Adam
Post by Robert de Vincy
Post by unknown
Do more maths instead :-)
Yep. That's useful for the "real world".
Actually it is.
A familiarity with numbers is essential to modern life,
I would disagree with that. With the implication, anyway, that in our
"modern life" we need familiarity with numbers more and more. Isn't
the hue and cry about innumeracy all about how our "modern life" gives
us computers and calculators and machines that place an extra level
between the common chap-in-the-street and a need to work with actual
numbers?

In other words, we -- now -- have less need to know how to add up or
work out a square-root to a practical precision or whatever because
"modern life" provides devices to do that stuff.

Or if you're talking about a familiarity with numbers that goes beyond
practical arithmetic, then... seriously? Who needs maths to that sort
of level enough to call its acquisition "essential to modern life"?
Post by Adam
and a mind taught to think logically helps people learn how to use
computers, since computers are also logical.
Spoken like a true CS student!

How about the software that people are using? Is that logical by virtue
of the computer's underlying principles? There's a whole layer of
potentially non-logical operating-system-plus-applications sitting there
that has to be learnt for anyone wanting to use a computer.

Learning a whole load of mathematical proofs and formulas and whatnots
will not help you use a word-processor, will not show you how to create
a safe and recoverable back-up routine, and will certainly not tell you
if formatting an infected floppy will remove the virus.
--
BdeV
Matt
2004-07-28 20:35:46 UTC
Permalink
Post by Robert de Vincy
Post by Adam
Post by Robert de Vincy
Post by unknown
Do more maths instead :-)
Yep. That's useful for the "real world".
Actually it is.
A familiarity with numbers is essential to modern life,
I would disagree with that. With the implication, anyway, that in our
"modern life" we need familiarity with numbers more and more. Isn't
the hue and cry about innumeracy all about how our "modern life" gives
us computers and calculators and machines that place an extra level
between the common chap-in-the-street and a need to work with actual
numbers?
Sale now on: 25% off marked price!

0% interest free credit for 9 months*
* 14.5% after this period

1 tin covers approx. 15m²

Adjust timing appropriately for other powers of microwave.

Prices exclusive of VAT

All prices in USD.

Interest paid at 3.1% PA

And there's utility bills, loans, mortgages, holidays...
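None of which needs more than basic arithmetic, of course. A minimal Python sketch of the sums behind a few of the examples above (all figures hypothetical, chosen just for illustration):

```python
# Everyday arithmetic behind the examples above -- all figures hypothetical.

marked_price = 80.00
sale_price = marked_price * (1 - 0.25)        # "25% off marked price"

loan = 1200.00
monthly_interest = loan * (0.145 / 12)        # 14.5% PA after the 0% period,
                                              # treated as simple interest

area = 52.0                                   # m^2 of wall to paint
tins = -(-area // 15)                         # 1 tin covers ~15 m^2; round up

print(sale_price, round(monthly_interest, 2), int(tins))  # 60.0 14.5 4
```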
Post by Robert de Vincy
In other words, we -- now -- have less need to know how to add up or
work out a square-root to a practical precision or whatever because
"modern life" provides devices to do that stuff.
"adding up" is learnt in primary school. Knowing what to add up is taught
later.

I haven't been taught how to calculate a square root manually after
completing A level Further Maths this year. Nor have I used trig tables,
or multiplied with logarithms. This is what my calculator can do for me.
But I have to know what to put into the calculator.
Post by Robert de Vincy
Or if you're talking about a familiarity with numbers that goes beyond
practical arithmetic, then... seriously?
Are you suggesting that practical arithmetic isn't necessary?
Post by Robert de Vincy
Who needs maths to that sort
of level enough to call its acquisition "essential to modern life"?
My parents would have been more impressed with their new garage if the
builder had got it square. It's not even trapezoid. And the new patio: I
knew instantly how to calculate the area of paving required, and the
angles necessary to cut some triangles. I drew templates for the paver
there.
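That patio sum is just school geometry and trigonometry. A sketch with made-up measurements (the real ones were never given):

```python
import math

# Hypothetical patio: 4.5 m x 3.0 m, with one corner cut off as a
# right triangle whose legs are 0.9 m and 0.6 m.
area = 4.5 * 3.0 - 0.5 * 0.9 * 0.6            # paving needed, in m^2
angle = math.degrees(math.atan2(0.6, 0.9))    # cut angle for the triangle

print(round(area, 2), round(angle, 1))
```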
Post by Robert de Vincy
Post by Adam
and a mind taught to think logically helps people learn how to use
computers, since computers are also logical.
Spoken like a true CS student!
[The whole post is] Spoken like someone who, when faced with a maths
problem, no matter how small, says "oh, I can't do maths" and gets away
with it. I think it's appalling, how something as basic to civilisation
as *_counting_* is being lost under the "I can't do maths" heading.
I've had people say 'oooh, aren't you clever' when I've given them the
exact money for several purchases at a shop whilst they type it all in a
calculator and find a result some time afterwards.
Post by Robert de Vincy
How about the software that people are using? Is that logical by virtue
of the computer's underlying principles? There's a whole layer of
potentially non-logical operating-system-plus-applications sitting there
that has to be learnt for anyone wanting to use a computer.
Yes, of course, but most software is logical 99% of the time, and when it
isn't you ask for help, in whatever form.
Post by Robert de Vincy
Learning a whole load of mathematical proofs and formulas and whatnots
will not help you use a word-processor, will not show you how to create
a safe and recoverable back-up routine, and will certainly not tell you
if formatting an infected floppy will remove the virus.
If we're discussing people who can't add up (as implied by you above) they
won't be learning proofs. GCSE Maths doesn't include reams of proofs to
learn. Of course, there are parts that aren't relevant, but compared to
ICT GCSE there is much more relevance to many more people.
--
Matt


-----= Posted via Newsfeeds.Com, Uncensored Usenet News =-----
http://www.newsfeeds.com - The #1 Newsgroup Service in the World!
-----== Over 100,000 Newsgroups - 19 Different Servers! =-----
Robert de Vincy
2004-07-28 20:57:55 UTC
Permalink
Post by Matt
Post by Robert de Vincy
Post by Adam
Post by Robert de Vincy
Post by unknown
Do more maths instead :-)
Yep. That's useful for the "real world".
Actually it is.
A familiarity with numbers is essential to modern life,
I would disagree with that. With the implication, anyway, that in
our "modern life" we need familiarity with numbers more and more.
Isn't the hue and cry about innumeracy all about how our "modern
life" gives us computers and calculators and machines that place an
extra level between the common chap-in-the-street and a need to work
with actual numbers?
Sale now on: 25% off marked price!
0% interest free credit for 9 months*
* 14.5% after this period
1 tin covers approx. 15m²
Adjust timing appropriately for other powers of microwave.
Prices exclusive of VAT
All prices in USD.
Interest paid at 3.1% PA
And there's utility bills, loans, mortgages, holidays...
Your point here escapes me.

I am responding to the original suggestion of:
"Do more maths instead :-)"

His/her "more maths" I took to mean "study mathematics at A-level (and
above) rather than take IT or anything else at a similar level."

So, basically, I agree that you need basic arithmetic skills to know
how to work out a percentage -- manually OR on a calculator.
Post by Matt
Post by Robert de Vincy
In other words, we -- now -- have less need to know how to add up or
work out a square-root to a practical precision or whatever because
"modern life" provides devices to do that stuff.
"adding up" is learnt in primary school. Knowing what to add up is
taught later.
I haven't been taught how to calculate a square root manually after
completing A level Further Maths this year. Nor have I used trig
tables, or multiplied with logarithms. This is what my calculator can
do for me. But I have to know what to put into the calculator.
Post by Robert de Vincy
Or if you're talking about a familiarity with numbers that goes
beyond practical arithmetic, then... seriously?
Are you suggesting that practical arithmetic isn't necessary?
The exact opposite. Didn't you follow this sub-thread from its inception?
Post by Matt
Post by Robert de Vincy
Who needs maths to that sort
of level enough to call its acquisition "essential to modern life"?
My parents would have been more impressed with their new garage if the
builder had got it square. It's not even trapezoid. And the new patio,
I knew instantly how to calculate the area of paving required, and the
angles necessary to cut some triangles. I drew templates for the paver
there.
Oh, so get all the builders to take maths A-level (i.e. "Do more maths")?
Post by Matt
Post by Robert de Vincy
Post by Adam
and a mind taught to think logically helps people learn how to use
computers, since computers are also logical.
Spoken like a true CS student!
[The whole post is] Spoken like someone who, when faced with a maths
problem, no matter how small, says "oh, I can't do maths" and gets
away with it. I think it's appalling, how something as basic to
civilisation as *_counting_* is being lost under the "I can't
do maths" heading. I've had people say 'oooh, aren't you clever' when
I've given them the exact money for several purchases at a shop whilst
they type it all in a calculator and find a result some time
afterwards.
Yes. I can do the same and more (that is, arithmetic that 'average'
people find hard or need a calculator for). But my point is (was) that
"Do more maths instead :-)" rather than taking other A-levels is being
offered as a panacea for curing the deficits in other areas of life. I
disagree strongly.
Post by Matt
Post by Robert de Vincy
How about the software that people are using? Is that logical by
virtue of the computer's underlying principles? There's a whole
layer of potentially non-logical operating-system-plus-applications
sitting there that has to be learnt for anyone wanting to use a
computer.
Yes, of course, but most software is logical 99% of the time, and when
it isn't you ask for help, in whatever form.
Well, yeah, of course!

But my reply addressed the assumption of Adam that simply because a
computer is logical then its software and operation is logical. I do
try to reply to what the previous post has said with a comment that relates
directly to that point, you know.
Post by Matt
Post by Robert de Vincy
Learning a whole load of mathematical proofs and formulas and
whatnots will not help you use a word-processor, will not show you
how to create a safe and recoverable back-up routine, and will
certainly not tell you if formatting an infected floppy will remove
the virus.
If we're discussing people who can't add up (as implied by you above)
they won't be learning proofs. GCSE Maths doesn't include reams of
proofs to learn. Of course, there are parts that aren't relevant, but
compared to ICT GCSE there is much more relevance to many more people.
No, I was not discussing people who can't add up. At least, I wasn't
until now.

My reaction was to the original "Do more maths instead :-)" piece of, um,
advice. Yeah, yeah, there was a smiley there, I know it was sort of
facetious, but it's a symptom of an attitude in this group in which
mathematics is exalted to such a superior status and anyone who dares
question it is attacked (see your reply for a prime example). So sue me
for trying to see things from the other side of bias.

And just in case my point has somehow slipped away (again), I am saying
that "Do more maths instead :-)" as flippant or serious advice in the
context of the original message is, well, wrong. As wrong as anything
could be wrong. Wronger, even.
--
BdeV
Ray Pang
2004-07-28 21:11:57 UTC
Permalink
Post by Robert de Vincy
Post by Matt
Post by Robert de Vincy
Post by Adam
Post by Robert de Vincy
Post by unknown
Do more maths instead :-)
Yep. That's useful for the "real world".
Actually it is.
A familiarity with numbers is essential to modern life,
I would disagree with that. With the implication, anyway, that in
our "modern life" we need familiarity with numbers more and more.
Isn't the hue and cry about innumeracy all about how our "modern
life" gives us computers and calculators and machines that place an
extra level between the common chap-in-the-street and a need to work
with actual numbers?
Sale now on: 25% off marked price!
0% interest free credit for 9 months*
* 14.5% after this period
1 tin covers approx. 15m²
Adjust timing appropriately for other powers of microwave.
Prices exclusive of VAT
All prices in USD.
Interest paid at 3.1% PA
And there's utility bills, loans, mortgages, holidays...
Your point here escapes me.
"Do more maths instead :-)"
And Matt is responding to:

">A familiarity with numbers is essential to modern life,

I would disagree with that."
Matt
2004-07-28 21:39:50 UTC
Permalink
Post by Ray Pang
Post by Robert de Vincy
Post by Matt
Post by Robert de Vincy
Post by Adam
A familiarity with numbers is essential to modern life,
I would disagree with that.
<snip>
Post by Ray Pang
Post by Robert de Vincy
Post by Matt
Sale now on: 25% off marked price!
<snip>
Post by Ray Pang
Post by Robert de Vincy
Your point here escapes me.
"Do more maths instead :-)"
">A familiarity with numbers is essential to modern life,
I would disagree with that."
Yes, exactly.

AIUI we weren't discussing A-levels at all, rather IT as taught by a
teacher who didn't know that formatting a floppy would remove a virus[1].
Hopefully well below A-level. I was thinking more of year 7/8/9 level.

Text that was cut:

Robert de Vincy
||| In other words, we -- now -- have less need to know how to add up or
||| work out a square-root to a practical precision or whatever because
||| "modern life" provides devices to do that stuff.

Matt
|| Are you suggesting that practical arithmetic isn't necessary?

R de V
| The exact opposite.

I am tired, perhaps I am interpreting this wrongly.

R de V
| Oh, so get all the builders to take maths A-level (i.e. "Do more maths")?

No. GCSE level. I can't change the perception that it's "OK to be bad at
maths" though, so there's nothing I can do about this. It's a shame this
opinion stuck.

A lot of people in the UK would find a little more maths helpful. I'm not
talking about A-levels here, (is there A level IT anyway?), rather basic
maths suitable for helping them through daily life, such as 'how much
paint should I buy?'.
--
Matt

[1] Except in very, very unusual circumstances, as seen on the thread in
uk.e.s-it


Robert de Vincy
2004-07-28 22:01:51 UTC
Permalink
Post by Matt
I can't change the perception that it's "OK to be bad at maths" though,
so there's nothing I can do about this. It's a shame this opinion stuck.
And there's the opinion that it's okay to be bad at spelling, too.

Perhaps we only see things from our own corners of interest, since I am
a student of Linguistics. My own gripes about education and the way
that the common-chap-in-the-street sees things are all based in language
matters. To give a few examples:
1. There are no truly standardized things in English, but spelling is
one that comes closest. And yet, look at the way people don't give
a toss about it. Definately. Alot. Acomodation. Your/you're. There/
their/they're. These are all -- to me -- basic, rudimentary things to
get right. I have to force myself to deliberately spell them wrong
(typos excluded).
2. And then there's the paradoxical attitude that there *IS* a standard
way to use English (syntax, in particular) and that doing it any other
way is somehow "bad grammar". If people could only have some basic
understanding that "bad grammar" is a myth, and there is only the
notion of using the wrong register for the wrong audience/readers.
3. The common idea that reading for pleasure and leisure is not what you
should do, given the choice of that or getting pissed down at the pub
every night.

I hope Ian Ford replies here and says that his gripe is that people don't
know enough about glaciers and what colour Mongolia ought to be on a map!
Post by Matt
A lot of people in the UK would find a little more maths helpful. I'm
not talking about A-levels here, (is there A level IT anyway?), rather
basic maths suitable for helping them through daily life, such as 'how
much paint should I buy?'.
Well, I read "Do more maths instead" posted in alt.uk.a-levels to be an
exhortation to take A-level maths rather than A-level IT (or "Computing"
or whatever it's called these days -- it was A-level "Computer Science"
in my time, but then that was before PCs and all).
--
BdeV
Matthew Huntbach
2004-07-29 08:20:04 UTC
Permalink
Post by Robert de Vincy
Well, I read "Do more maths instead" posted in alt.uk.a-levels to be an
exhortation to take A-level maths rather than A-level IT (or "Computing"
or whatever it's called these days -- it was A-level "Computer Science"
in my time, but then that was before PCs and all).
There has never been an A-level called "Computer Science".

There is an A-level called "Computing", which has always been called that.
There is also an A-level called "Information Technology" (although these
days it seems to be called "Information and Communication Technology").
These are two separate subjects, although there is some overlap.

A-level Computing is about how computers work, and A-level IT is about using
computers. Think of it as the difference between learning to drive a car and
learning to be a car mechanic who does basic maintenance of cars.

In that sense, Computer Science is about learning to be an engineer who
designs cars. That is a different thing entirely from being a driver or a
mechanic.

Matthew Huntbach
Robert de Vincy
2004-07-29 08:55:00 UTC
Permalink
Post by Matthew Huntbach
There has never been an A-level called "Computer Science".
Well, that's how we all referred to it in my class back in 1990-1991.

It was definitely *NOT* "IT". We had lessons in programming techniques
and data handling (learning very basic data structures, stacks, lists, etc),
there was stuff on processor architecture (using the 6502 as the basis),
and the coursework was to write a piece of software (with full documentation)
that stored data to a semi-permanent medium (i.e., "load your data off
a disk, don't just use stuff that's being input there-and-then at execution
time") in a high-level language. All using the definitions of terms as
relevant to computers in the late 80s and early 90s, of course, and there
was absolutely no requirement to learn spreadsheets, word-processors, etc.
--
BdeV
Matthew Huntbach
2004-07-29 10:45:05 UTC
Permalink
Post by Robert de Vincy
Post by Matthew Huntbach
There has never been an A-level called "Computer Science".
Well, that's how we all referred to it in my class back in 1990-1991.
Nevertheless, I am sure its proper title then, as now, was "Computing" and
not "Computer Science". I find now that many of my applicants put on their
UCAS form that they are taking A-level "Computer Science", but when you
check up it is actually an A-level called by its examination board
"Computing".
Post by Robert de Vincy
It was definitely *NOT* "IT". We had lessons in programming techniques
and data handling (learning very basic data structures, stacks, lists, etc),
there was stuff on processor architecture (using the 6502 as the basis),
and the coursework was to write a piece of software (with full documentation)
that stored data to a semi-permanent medium (i.e., "load your data off
a disk, don't just use stuff that's being input there-and-then at execution
time") in a high-level language. All using the definitions of terms as
relevant to computers in the late 80s and early 90s, of course, and there
was absolutely no requirement to learn spreadsheets, word-processors, etc.
Yes, the same as now - there is an A-level called "Computing" and it is a
separate A-level from the one called "Information Technology" and the exam
boards all run these two separate A-levels. For my purposes as an admissions
tutor in Computer Science, A-level Computing is ok - students who have taken
it will be familiar with some of the terminology and may even have done a
bit of programming. However, it seems you can pass it without doing very
much programming, and a very large part of it is simply rote-learning
definitions rather than getting into the sort of abstract thinking that is
central to Computer Science. Also the view of computing as given in the
A-level syllabus seems to over-concentrate on hardware issues and to be
rather seriously outdated.

Matthew Huntbach
Timmy
2004-07-29 22:06:05 UTC
Permalink
Post by Matthew Huntbach
Post by Robert de Vincy
Well, I read "Do more maths instead" posted in alt.uk.a-levels to be an
exhortation to take A-level maths rather than A-level IT (or "Computing"
or whatever it's called these days -- it was A-level "Computer Science"
in my time, but then that was before PCs and all).
There has never been an A-level called "Computer Science".
The moment I read this I raised an eyebrow! On my shelf is a book
titled A-Level and AS-Level Computer Science (ISBN 0582057825) 1990,
which strangely appears on an American Web site:

http://www.netstoreusa.com/cbbooks/058/0582057825.shtml


I thought maybe this was IT A-Level dressed as computer science, so I
decided to open the book!

Inside, it contains some IT-type fluff (Systems Analysis), but it
does have some cool computer science stuff, including:

Data structures- Queue, stacks, B-Trees (binary not balanced), linked
lists, arrays.

Maths- binary stuff, boolean algebra.

Processor design, logic gates etc.

Programming- lots of assembler!

OK so there is nothing that goes into anywhere near the level of the
1st year of a CompSci degree but it looks like Computer Science to me!

Interestingly there is only one paragraph on databases!
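For anyone who hasn't met them, two of the book's staples fit in a few lines of Python (a rough sketch of the general ideas, not anything taken from the book itself):

```python
# A stack (LIFO) and an XOR gate built from AND/OR/NOT -- two staples of
# the data-structures and boolean-algebra chapters listed above.

class Stack:
    def __init__(self):
        self._items = []

    def push(self, item):
        self._items.append(item)

    def pop(self):
        return self._items.pop()      # last in, first out

def xor(a, b):
    # XOR expressed with the basic boolean connectives
    return (a or b) and not (a and b)

s = Stack()
s.push("first")
s.push("second")
print(s.pop())            # second
print(xor(True, True))    # False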
Ray Pang
2004-07-29 22:51:46 UTC
Permalink
Post by Timmy
Post by Matthew Huntbach
Post by Robert de Vincy
Well, I read "Do more maths instead" posted in alt.uk.a-levels to be an
exhortation to take A-level maths rather than A-level IT (or "Computing"
or whatever it's called these days -- it was A-level "Computer Science"
in my time, but then that was before PCs and all).
There has never been an A-level called "Computer Science".
The moment I read this I raised an eyebrow! On my shelf is a book
titled A-Level and AS-Level Computer Science (ISBN 0582057825) 1990,
http://www.netstoreusa.com/cbbooks/058/0582057825.shtml
I thought maybe this was IT A-Level dressed as computer science, so I
decided to open the book!
Inside it contains some IT type fluff stuff (Systems Analysis), but
Data structures- Queue, stacks, B-Trees (binary not balanced), linked
lists, arrays.
Maths- binary stuff, boolean algebra.
Processor design, logic gates etc.
Programming- lots of assembler!
OK so there is nothing that goes into anywhere near the level of the
1st year of a CompSci degree but it looks like Computer Science to me!
There's a lot of boolean algebra in 1st year Comp Sci IIRC. And logic and
processor design.
Post by Timmy
Interestingly there is only one paragraph on databases!
A paragraph?!
Robert de Vincy
2004-07-30 07:20:29 UTC
Permalink
Post by Timmy
Post by Matthew Huntbach
Post by Robert de Vincy
Well, I read "Do more maths instead" posted in alt.uk.a-levels to be
an exhortation to take A-level maths rather than A-level IT (or
"Computing" or whatever it's called these days -- it was A-level
"Computer Science" in my time, but then that was before PCs and
all).
There has never been an A-level called "Computer Science".
The moment I read this I raised an eyebrow! On my shelf is a book
titled A-Level and AS-Level Computer Science (ISBN 0582057825) 1990,
http://www.netstoreusa.com/cbbooks/058/0582057825.shtml
I thought maybe this was IT A-Level dressed as computer science, so I
decided to open the book!
Inside it contains some IT type fluff stuff (Systems Analysis), but
Data structures- Queue, stacks, B-Trees (binary not balanced), linked
lists, arrays.
Maths- binary stuff, boolean algebra.
Processor design, logic gates etc.
Programming- lots of assembler!
That does indeed sound like the basic outline of what I/we did back
then, yep.
Post by Timmy
OK so there is nothing that goes into anywhere near the level of the
1st year of a CompSci degree but it looks like Computer Science to me!
Interestingly there is only one paragraph on databases!
I don't recall even touching the idea of "databases".
--
BdeV
John Porcella
2004-08-01 11:02:44 UTC
Permalink
Post by Robert de Vincy
I can't change the perception that it's "OK to be bad at maths" though,
so there's nothing I can do about this. It's a shame this opinion stuck.
And there's the opinion that it's okay to be bad at spelling, too.
Which, as I think you would agree, not all of us adhere to. In marking 'A'
level scripts I always ringed incorrect spellings even though, as my team
leader warned, it would make the marking process interminable. Since we had
to award up to four marks for "quality of language", the students who could
not spell or could not be bothered lost marks that were very easy to gain.

The use of paragraphs seems to have disappeared too in examinations.
Post by Robert de Vincy
Perhaps we only see things from our own corners of interest, since I am
a student of Linguistics. My own gripes about education and the way
that the common-chap-in-the-street sees things are all based in language
1. There are no truly standardized things in English, but spelling is
one that comes closest. And yet, look at the way people don't give
a toss about it. Definately. Alot. Acomodation. Your/you're. There/
their/they're. These are all -- to me -- basic, rudimentary things to
get right. I have to force myself to deliberately spell them wrong
(typos excluded).
Indeed.
--
MESSAGE ENDS.
John Porcella
John Porcella
2004-08-01 10:57:42 UTC
Permalink
Post by Matt
AIUI we weren't discussing A-levels at all, rather IT as taught by a
teacher who didn't know that formatting a floppy would remove a virus[1].
I am not sure that a "quick format" would remove a virus, since only the FAT
is deleted, leaving the files on the disk surface.

Don't some viruses hide in the boot sector of a disk? Do all forms of
formatting clean this area up?

I thought that the question was a deceptive one, in that the answer is,
perhaps, not as obvious as it might appear at first.
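The doubt is well founded. A toy model of the distinction (an invented sector layout, nothing like real FAT12, purely to show why a quick format can leave boot-sector code behind while a full format overwrites it):

```python
SECTOR = 512
disk = bytearray(SECTOR * 8)                 # a pretend 8-sector floppy
disk[0:5] = b"VIRUS"                         # code lodged in the boot sector
disk[3 * SECTOR:3 * SECTOR + 4] = b"DATA"    # some old file contents

def quick_format(d):
    # Rewrites only the allocation tables / directory (sectors 1-2 here);
    # the boot sector and file data are untouched.
    d[1 * SECTOR:3 * SECTOR] = bytes(2 * SECTOR)

def full_format(d):
    # Rewrites every sector (a real full format would also lay down a
    # fresh, clean boot sector).
    d[:] = bytes(len(d))

quick_format(disk)
print(bytes(disk[0:5]))   # b'VIRUS' -- still there
full_format(disk)
print(bytes(disk[0:5]))   # b'\x00\x00\x00\x00\x00'
```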
--
MESSAGE ENDS.
John Porcella
Matthew Huntbach
2004-07-29 09:44:41 UTC
Permalink
Post by Robert de Vincy
My reaction was to the original "Do more maths instead :-)" piece of, um,
advice. Yeah, yeah, there was a smiley there, I know it was sort of
facetious, but it's a symptom of an attitude in this group in which
mathematics is exalted to such a superior status and anyone who dares
question it is attacked (see your reply for a prime example). So sue me
for trying to see things from the other side of bias.
I am one of those who pushes the "do more maths" message in this group, but
I do it in the context of this group's purpose to provide practical advice to
those taking A-levels. It is a fact that A-level Maths is in high demand.
There are many degree subjects for which A-level Maths is either essential or
highly desirable. There are few (maybe no) degree subjects for which A-level
Information Technology is essential or highly desirable.

In my case, it's not a matter of adopting some superior attitude. It's a
matter of fact that when I look at the degree results of students and
compare them to their entrance qualifications, I see that on average
students who have A-level Maths do better than students who do not have
A-level Maths. Therefore I am likely to look favourably on those who have
done A-level Maths and unfavourably on those who have not. Unfortunately,
my experience is that many people take on the message "maths is useless" at
the age of 16. It's only when they get to apply to university and find that
they can't do the degrees they want to do that they realise they were wrong.

My subject uses very little of the material from A-level Maths. So I believe
the benefit of doing A-level Maths comes more from its training and testing
in logical ways of thinking than from its actual material.

Matthew Huntbach
Alun Harford
2004-07-28 21:37:47 UTC
Permalink
Post by Matt
I haven't been taught how to calculate a square root manually after
completing A level Further Maths this year.
I'd be very worried if you couldn't figure out a way.
To go back to A-level maths (P2 or P3 iirc):
(a+b)^(1/2)=...
SQRT(101) = (100+1)^(1/2) = ...

I don't know if you did P6 but there's another (more useful) method there.

Or better yet - just leave it as root whatever and don't bother (unless
that's the whole point of the exercise)
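The method being gestured at is the binomial series: for small x, (1+x)^(1/2) is approximately 1 + x/2 - x^2/8. A quick check in Python (my choice of three terms, not necessarily how P2/P3 presents it):

```python
import math

# sqrt(101) = sqrt(100 * 1.01) = 10 * (1 + 0.01)**0.5,
# then expand (1+x)^(1/2) ~ 1 + x/2 - x^2/8 for x = 0.01.
x = 0.01
approx = 10 * (1 + x / 2 - x * x / 8)   # ~ 10.049875

print(approx)
print(math.sqrt(101))   # agrees to about six decimal places
```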
Post by Matt
Nor have I used trig tables,
Got to admit - neither have I.
Post by Matt
or multiplied with logarithms.
Useful when numbers get large. Can't you think of a number your calculator
won't be happy with :-)
Multiply ((10^10)^10)! by ((6^6)^6)!
Get back to me when your calculator has finished!
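Fair point, though logarithms tame exactly that monster: work with log10 of each factorial (via the log-gamma function) instead of the factorials themselves. A sketch:

```python
import math

def log10_factorial(n):
    # log10(n!) = lgamma(n + 1) / ln(10); no need to build n! itself
    return math.lgamma(n + 1) / math.log(10)

a = log10_factorial(10 ** 100)   # ((10^10)^10)! = (10^100)!
b = log10_factorial(6 ** 36)     # ((6^6)^6)!  = (6^36)!

# The product has roughly a + b decimal digits -- far beyond any calculator.
print(a + b)
```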
Post by Matt
But I have to know what to put into the calculator.
Personally, my Further Maths taught me to never bother with a calculator.
If it's worth bothering to solve, solve it in the general case.
If it's too complex, get a proper computer out because your pocket
calculator won't do very well.

They're nice if you need to multiply lots of pairs of 5 digit numbers
together - otherwise they're just not worth the hassle.

Alun Harford
Matthew Huntbach
2004-07-29 08:09:09 UTC
Permalink
Post by Matt
Post by Robert de Vincy
Post by Adam
A familiarity with numbers is essential to modern life,
I would disagree with that. With the implication, anyway, that in our
"modern life" we need familiarity with numbers more and more. Isn't
the hue and cry about innumeracy all about how our "modern life" gives
us computers and calculators and machines that place an extra level
between the common chap-in-the-street and a need to work with actual
numbers?
Sale now on: 25% off marked price!
etc.

My favourite example is the line

"They tell us inflation is coming down, so how come prices are still going
up?"

A familiarity with calculus would help get people to understand the
difference between the rate of change and the rate of change of the rate of
change.

However, a simple way to explain it is "Look, if your car is decelerating,
it's still moving forwards".

Matthew Huntbach
Toby
2004-07-29 11:00:38 UTC
Permalink
Post by Matthew Huntbach
Post by Matt
Post by Robert de Vincy
Post by Adam
A familiarity with numbers is essential to modern life,
I would disagree with that. With the implication, anyway, that in our
"modern life" we need familiarity with numbers more and more. Isn't
the hue and cry about innumeracy all about how our "modern life" gives
us computers and calculators and machines that place an extra level
between the common chap-in-the-street and a need to work with actual
numbers?
Sale now on: 25% off marked price!
etc.
My favourite example is the line
"They tell us inflation is coming down, so how come prices are still going
up?"
A familiarity with calculus would help get people to understand the
difference between the rate of change and the rate of change of the rate of
change.
However, a simple way to explain it is "Look, if your car is decelerating,
it's still moving forwards".
Matthew Huntbach
So prices are increasing but not by as much etc. etc.?
Ray Pang
2004-07-29 17:09:19 UTC
Permalink
Post by Toby
Post by Matthew Huntbach
Post by Matt
Post by Robert de Vincy
Post by Adam
A familiarity with numbers is essential to modern life,
I would disagree with that. With the implication, anyway, that in our
"modern life" we need familiarity with numbers more and more. Isn't
the hue and cry about innumeracy all about how our "modern life" gives
us computers and calculators and machines that place an extra level
between the common chap-in-the-street and a need to work with actual
numbers?
Sale now on: 25% off marked price!
etc.
My favourite example is the line
"They tell us inflation is coming down, so how come prices are still going
up?"
A familiarity with calculus would help get people to understand the
difference between the rate of change and the rate of change of the rate of
change.
However, a simple way to explain it is "Look, if your car is decelerating,
it's still moving forwards".
Matthew Huntbach
So prices are increasing but not by as much etc. etc.?
House prices are still rising, but not as quickly, although the rate at
which they are slowing down is decreasing, which means that the increase in
prices will become a faster increase. This is where maths steps in. Replace
all this by x, dx/dt, d2x/dt2, d3x/dt3 and it's all a lot clearer.
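The same point with some made-up numbers (hypothetical figures, just to show
the shape): a price index that rises every year while the year-on-year
increase, the "inflation", shrinks.

```python
# Hypothetical price index: always rising, but by less each year.
prices = [100.0]
for rise in [10.0, 6.0, 3.0, 1.0]:   # shrinking rises = falling inflation
    prices.append(prices[-1] + rise)

increases = [b - a for a, b in zip(prices, prices[1:])]
print(prices)     # [100.0, 110.0, 116.0, 119.0, 120.0] -- rising throughout
print(increases)  # [10.0, 6.0, 3.0, 1.0] -- falling throughout
```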
Toby
2004-07-29 17:32:30 UTC
Permalink
Post by Ray Pang
Post by Toby
Post by Matthew Huntbach
Post by Matt
Post by Robert de Vincy
Post by Adam
A familiarity with numbers is essential to modern life,
I would disagree with that. With the implication, anyway, that in our
"modern life" we need familiarity with numbers more and more. Isn't
the hue and cry about innumeracy all about how our "modern life" gives
us computers and calculators and machines that place an extra level
between the common chap-in-the-street and a need to work with actual
numbers?
Sale now on: 25% off marked price!
etc.
My favourite example is the line
"They tell us inflation is coming down, so how come prices are still going
up?"
A familiarity with calculus would help get people to understand the
difference between the rate of change and the rate of change of the rate of
change.
However, a simple way to explain it is "Look, if your car is decelerating,
it's still moving forwards".
Matthew Huntbach
So prices are increasing but not by as much etc. etc.?
House prices are still rising, but not as quickly, although the rate at
which they are slowing down is decreasing, which means that the increase in
prices will become a faster increase. This is where maths steps in. Replace
all this by x, dx/dt, d2x/dt2, d3x/dt3 and it's all a lot clearer.
hehe no I'm with you, I've never had a problem with transferring
language into maths straight away (though for some reason I always
hated f(x) at the start...)
John Porcella
2004-08-02 11:56:55 UTC
Permalink
Post by Toby
Post by Ray Pang
House prices are still rising, but not as quickly, although the rate at
which they are slowing down is decreasing, which means that the increase in
prices will become a faster increase. This is where maths steps in. Replace
all this by x, dx/dt, d2x/dt2, d3x/dt3 and it's all a lot clearer.
hehe no I'm with you, I've never had a problem with transferring
language into maths straight away (though for some reason I always
hated f(x) at the start...)
Good, but this can only be done successfully if you understand the language
to begin with. That you had to ask for confirmation indicates some
hesitancy and/or lack of certainty, perhaps? Without that certainty, the
translation into 'math-speak' would be tricky.
--
MESSAGE ENDS.
John Porcella
Toby
2004-08-02 12:36:16 UTC
Permalink
On Mon, 2 Aug 2004 11:56:55 +0000 (UTC), "John Porcella"
Post by Ray Pang
Post by Toby
Post by Ray Pang
House prices are still rising, but not as quickly, although the rate at
which they are slowing down is decreasing, which means that the increase
in
Post by Toby
Post by Ray Pang
prices will become a faster increase. This is where maths steps in.
Replace
Post by Toby
Post by Ray Pang
all this by x, dx/dt, d2x/dt2, d3x/dt3 and it's all a lot clearer.
hehe no I'm with you, I've never had a problem with transferring
language into maths straight away (though for some reason I always
hated f(x) at the start...)
Good, but this can only be done successfully if you understand the language
to begin with. That you had to ask for confirmation indicates some
hesitancy and/or lack of certainty, perhaps? Without that certainty, the
translation into 'math-speak' would be tricky.
I just wanted confirmation.
John Porcella
2004-08-02 11:54:15 UTC
Permalink
Post by Toby
Post by Matthew Huntbach
Matthew Huntbach
So prices are increasing but not by as much etc. etc.?
Yes.
--
MESSAGE ENDS.
John Porcella
John Porcella
2004-08-02 11:53:13 UTC
Permalink
Post by Matthew Huntbach
Post by Matt
Post by Robert de Vincy
Post by Adam
A familiarity with numbers is essential to modern life,
I would disagree with that. With the implication, anyway, that in our
"modern life" we need familiarity with numbers more and more. Isn't
the hue and cry about innumeracy all about how our "modern life" gives
us computers and calculators and machines that place an extra level
between the common chap-in-the-street and a need to work with actual
numbers?
Sale now on: 25% off marked price!
etc.
My favourite example is the line
"They tell us inflation is coming down, so how come prices are still going
up?"
A familiarity with calculus would help get people to understand the
difference between the rate of change and the rate of change of the rate of
change.
Prices going up is an absolute change in price, but a fall in inflation is
a change in the relative growth.
Post by Matthew Huntbach
However, a simple way to explain it is "Look, if your car is decelerating,
it's still moving forwards".
Excellent example.

I can confirm from marking many hundreds of A level scripts that the vast
majority of students would write that falling inflation meant that prices
were falling/getting cheaper, when they should have written that they are
rising at a slower rate.
--
MESSAGE ENDS.
John Porcella
Ray Pang
2004-07-28 21:01:15 UTC
Permalink
Post by Robert de Vincy
Post by Adam
Post by Robert de Vincy
Post by unknown
Do more maths instead :-)
Yep. That's useful for the "real world".
Actually it is.
A familiarity with numbers is essential to modern life,
I would disagree with that. With the implication, anyway, that in our
"modern life" we need familiarity with numbers more and more.
Maybe not essential in a rudimentary "Me Tarzan, You Jane" sense, but when
you apply for your mortgage, if they told you that you'd be paying 4% flat
interest, would you know? What about compound interest, and the APR
explanation? What about when Britain changes to kilometres? You might learn
that you divide miles by 5 then multiply by 8, but when you're driving and
you see an 80kmh sign, would you instantly recognise that that's close
enough to 50mph?
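The divide-by-5-multiply-by-8 rule can be sanity-checked against the exact
conversion factor (1 mile = 1.609344 km); a quick sketch:

```python
def miles_to_km_rule(miles):
    # The mental shortcut: divide by 5, multiply by 8.
    return miles / 5 * 8

print(miles_to_km_rule(50))  # 80.0, so an 80 km/h limit is about 50 mph
print(50 * 1.609344)         # 80.4672, the exact conversion
```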
Post by Robert de Vincy
Isn't
the hue and cry about innumeracy all about how our "modern life" gives
us computers and calculators and machines that place an extra level
between the common chap-in-the-street and a need to work with actual
numbers?
No, because there are situations when a calculator isn't practical, like the
one I mentioned above. Much of the brouhaha about the change to the Euro and
metric system was because people couldn't divide by "2 and a bit" to convert
from lb to kg, for instance. People were completely baffled as to what a
kilo of spuds was.
Post by Robert de Vincy
In other words, we -- now -- have less need to know how to add up or
work out a square-root to a practical precision or whatever because
"modern life" provides devices to do that stuff.
If you were buying a carpet for your room, you'd be able to get a quick
guess whether that nice carpet going cheap in a "last in stock" offer would
fit your room or not. Again, not absolutely vital, but handy nonetheless.
Post by Robert de Vincy
Or if you're talking about a familiarity with numbers that goes beyond
practical arithmetic, then... seriously? Who needs maths to that sort
of level enough to call its acquisition "essential to modern life"?
Post by Adam
and a mind taught to think logically helps people learn how to use
computers, since computers are also logical.
Spoken like a true CS student!
How about the software that people are using? Is that logical by virtue
of the computer's underlying principles?
It behaves precisely how the programming code dictates it to run on a
specific machine under the specific conditions. It's as logical as the
universe. Everything (should) make sense.
Post by Robert de Vincy
There's a whole layer of
potentially non-logical operating-system-plus-applications sitting there
that has to be learnt for anyone wanting to use a computer.
Logical from whose/what point of view? From a mathematician's point of view,
it's all logical.
Post by Robert de Vincy
Learning a whole load of mathematical proofs and formulas and whatnots
will not help you use a word-processor, will not show you how to create
a safe and recoverable back-up routine, and will certainly not tell you
if formatting an infected floppy will remove the virus.
Having the sort of brain that a mathematician should have will make you
attack each of these problems in a logical manner, rather than flapping your
hands in the air when your PC tells you that your floppy disk is infected.
Robert de Vincy
2004-07-28 21:28:28 UTC
Permalink
Post by Ray Pang
Post by Robert de Vincy
I would disagree with that. With the implication, anyway, that in
our "modern life" we need familiarity with numbers more and more.
Maybe not essential in a rudimentary "Me Tarzan, You Jane" sense, but
when you apply for your mortgage, if they told you that you'd be
paying 4% flat interest, would you know? What about compound interest,
and the APR explanation? What about when Britain changes to
kilometres? You might learn that you divide miles by 5 then multiply
by 8, but when you're driving and you see an 80kmh sign, would you
instantly recognise that that's close enough to 50mph?
Take my sentence above and read it with the rest of the paragraph it
Post by Ray Pang
Post by Robert de Vincy
Isn't the hue and cry about innumeracy all about how our "modern life"
gives us computers and calculators and machines that place an extra
level between the common chap-in-the-street and a need to work with
actual numbers?
I'm disagreeing with Adam's "A familiarity with numbers is essential to modern
life" statement. Perhaps I'm reading more implied meaning into it than
intended and if so then I'm mistaken and I'll need to rethink what I
originally wrote.
But if you take my reply in the context of my believing that Adam was
saying that we, today, 21st-Century people, need more familiarity with
numbers as compared to previous generations, then... you see? To get by
in "modern life" we don't need the level of hands-on work-it-out-with-a-
pencil-and-paper maths skills that, say, our grandfathers needed.
(And I'm not saying this is a good thing or a bad thing -- it's just a
thing.)
Post by Ray Pang
No, because there are situations when a calculator isn't practical,
like the one I mentioned above. Much of the brouhaha about the change
to the Euro and metric system was because people couldn't divide by "2
and a bit" to convert from lb to kg, for instance. People were
completely baffled as to what a kilo of spuds was.
Post by Robert de Vincy
In other words, we -- now -- have less need to know how to add up or
work out a square-root to a practical precision or whatever because
"modern life" provides devices to do that stuff.
If you were buying a carpet for your room, you'd be able to get a
quick guess whether that nice carpet going cheap in a "last in stock"
offer would fit your room or not. Again, not absolutely vital, but
handy nonetheless.
Exactly. But "Do more maths" would help with that? Nah. Firstly, the
people who would actually want to "Do more maths" would already know how to
work out areas (I hope!) so further instruction would be pointless. And,
secondly, the people who would benefit from learning how to calculate
carpet/floor areas on-the-fly would not be the ones we should be putting
onto A-level courses until they had already learnt those basics.
Post by Ray Pang
Post by Robert de Vincy
Or if you're talking about a familiarity with numbers that goes
beyond practical arithmetic, then... seriously? Who needs maths to
that sort of level enough to call its acquisition "essential to
modern life"?
Post by Adam
and a mind taught to think logically helps people learn how to use
computers, since computers are also logical.
Spoken like a true CS student!
How about the software that people are using? Is that logical by
virtue of the computer's underlying principles?
It behaves precisely how the programming code dictates it to run on a
specific machine under the specific conditions. It's as logical as the
universe. Everything (should) make sense.
Should, but doesn't. Why else do we have "Windows sucks!" mentalities
being perpetuated? If Windows was such a perfectly logical system then
it would work. First time, every time.
Post by Ray Pang
Post by Robert de Vincy
There's a whole layer of potentially non-logical operating-system-plus-
applications sitting there that has to be learnt for anyone wanting to
use a computer.
Logical from whose/what point of view? From a mathematician's point of
view, it's all logical.
Once again, this is presuming that because the underlying "mechanics" are
logical in operation then anything created on them will automatically be
logical. What about the human factor? I could go away now and write a
Delphi application that would be highly illogical and very difficult to
operate. How could the underlying computer's principles of logic stop me
from incorporating illogical features? (I'm not saying that programmers
do this deliberately in commercial applications, of course! But what can
be done here deliberately can also creep in accidentally.)
Post by Ray Pang
Post by Robert de Vincy
Learning a whole load of mathematical proofs and formulas and
whatnots will not help you use a word-processor, will not show you
how to create a safe and recoverable back-up routine, and will
certainly not tell you if formatting an infected floppy will remove
the virus.
Having the sort of brain that a mathematician should have will make
you attack each of these problems in a logical manner, rather than
flapping your hands in the air when your PC tells you that your floppy
disk is infected.
And specific knowledge about how viruses work and how floppy disks store
their data would not be needed?

Unless you're saying that "a logical manner" includes doing all the
research to gather all the relevant information and then using that to
solve the problem.
--
BdeV
Ray Pang
2004-07-28 23:06:46 UTC
Permalink
Post by Robert de Vincy
But if you take my reply in the context of my believing that Adam was
saying that we, today, 21st-Century people, need more familiarity with
numbers as compared to previous generations, then... you see? To get by
in "modern life" we don't need the level of hands-on work-it-out-with-a-
pencil-and-paper maths skills that, say, our grandfathers needed.
(And I'm not saying this is a good thing or a bad thing -- it's just a
thing.)
Absolutely. We need the ability to estimate things from an understanding of
the underlying maths. So if I said I needed to buy 41 books at £39, I'd
immediately recognise that I'd be spending about £1600. In much the same
way, VAT is 17.5%, so on a £103 purchase I'm paying about £18. The finer
details can of course be looked up and/or calculated electronically or by
pen-and-paper, but it's an *understanding* of maths which is necessary,
rather than the mechanical processes.
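Both estimates check out; a two-line sanity check in Python (illustrative):

```python
# 41 books at £39: estimate 40 * 40 = 1600 in your head, then check exactly.
print(41 * 39)      # 1599

# VAT at 17.5% on £103: 10% + 5% + 2.5% of 103, i.e. 10.30 + 5.15 + 2.575.
print(103 * 0.175)  # just over 18, so "about £18"
```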
Post by Robert de Vincy
Post by Ray Pang
No, because there are situations when a calculator isn't practical,
like the one I mentioned above. Much of the brouhaha about the change
to the Euro and metric system was because people couldn't divide by "2
and a bit" to convert from lb to kg, for instance. People were
completely baffled as to what a kilo of spuds was.
Post by Robert de Vincy
In other words, we -- now -- have less need to know how to add up or
work out a square-root to a practical precision or whatever because
"modern life" provides devices to do that stuff.
If you were buying a carpet for your room, you'd be able to get a
quick guess whether that nice carpet going cheap in a "last in stock"
offer would fit your room or not. Again, not absolutely vital, but
handy nonetheless.
Exactly. But "Do more maths" would help with that? Nah. Firstly, the
people who would actually want to "Do more maths" would already know how to
work out areas (I hope!) so further instruction would be pointless. And,
secondly, the people who would benefit from learning how to calculate
carpet/floor areas on-the-fly would not be the ones we should be putting
onto A-level courses until they had already learnt those basics.
Post by Ray Pang
Post by Robert de Vincy
Or if you're talking about a familiarity with numbers that goes
beyond practical arithmetic, then... seriously? Who needs maths to
that sort of level enough to call its acquisition "essential to
modern life"?
Post by Adam
and a mind taught to think logically helps people learn how to use
computers, since computers are also logical.
Spoken like a true CS student!
How about the software that people are using? Is that logical by
virtue of the computer's underlying principles?
It behaves precisely how the programming code dictates it to run on a
specific machine under the specific conditions. It's as logical as the
universe. Everything (should) make sense.
Should, but doesn't. Why else do we have "Windows sucks!" mentalities
being perpetuated? If Windows was such a perfectly logical system then
it would work. First time, every time.
Windows is perfectly logical in the sense that the code is written with
mistakes in it, and thus it doesn't perform how the writer intended it to
perform. A computer (assuming no hardware flaw, no cosmic rays or whatever,
all of which just make the logical system more complex) does as it is told
to.
Post by Robert de Vincy
Post by Ray Pang
Post by Robert de Vincy
There's a whole layer of potentially non-logical operating-system-plus-
applications sitting there that has to be learnt for anyone wanting to
use a computer.
Logical from whose/what point of view? From a mathematician's point of
view, it's all logical.
Once again, this is presuming that because the underlying "mechanics" are
logical in operation then anything created on them will automatically be
logical. What about the human factor? I could go away now and write a
Delphi application that would be highly illogical and very difficult to
operate. How could the underlying computer's principles of logic stop me
from incorporating illogical features?
Which is precisely my point. It depends on what exactly you mean by
logical. Who is to say that what you imagine to be logical isn't
completely unfathomable to somebody else? All I was pointing out was
that computer programs are logical in the sense that they do what the
programmer specifically made them do.
Post by Robert de Vincy
Post by Ray Pang
Post by Robert de Vincy
Learning a whole load of mathematical proofs and formulas and
whatnots will not help you use a word-processor, will not show you
how to create a safe and recoverable back-up routine, and will
certainly not tell you if formatting an infected floppy will remove
the virus.
Having the sort of brain that a mathematician should have will make
you attack each of these problems in a logical manner, rather than
flapping your hands in the air when your PC tells you that your floppy
disk is infected.
And specific knowledge about how viruses work and how floppy disks store
their data would not be needed?
Obviously, if you'd been taught how to deal with viruses then you're
laughing, but a logical mind would think, hang on, this Norton Antivirus
thing sitting in the corner must be related. Let's give it a try. Hmm, that
"Scan my computer for viruses button" seems to be quite sensible. An
illogical mind would think, shit, I have a virus, end of the world. Help.
Post by Robert de Vincy
Unless you're saying that "a logical manner" includes doing all the
research to gather all the relevant information and then using that to
solve the problem.
Yup. That's what I was saying.
Robert de Vincy
2004-07-28 23:49:50 UTC
Permalink
Ray Pang did write:

[snips]
Post by Ray Pang
Windows is perfectly logical in the sense that the code is written
with mistakes in it, and thus it doesn't perform how the writer
intended it to perform. A computer (assuming no hardware flaw, no
cosmic rays or whatever, all of which just make the logical system
more complex) does as it is told to.
[...]
Post by Ray Pang
Which is precisely my point. It depends on what exactly you mean by
logical. Who is to say that what you imagine to be logical isn't
completely unfathomable to somebody else? All I was pointing out was
that computer programs are logical in the sense that they do what the
programmer specifically made them do.
But that's just playing pick-n-choose with definitions to fit some purpose
outside of the actual context.

When I say that a program ought to be logical, whom do you think I am
really framing my definition of "logical" by? The machine or the person
who wants to use the machine? For me (someone who once revelled in all
that machine-based geekery but has now turned his back on it and returned
his attention to human beings) the user of the computer -- of any tool,
when it comes to that -- is the element of prime importance and the one
for whom the whole interaction with that tool ought to take precedence.

Is this a Science vs. Arts thing?
--
BdeV
Ray Pang
2004-07-29 07:10:51 UTC
Permalink
Post by Robert de Vincy
[snips]
Post by Ray Pang
Windows is perfectly logical in the sense that the code is written
with mistakes in it, and thus it doesn't perform how the writer
intended it to perform. A computer (assuming no hardware flaw, no
cosmic rays or whatever, all of which just make the logical system
more complex) does as it is told to.
[...]
Post by Ray Pang
Which is precisely my point. It depends on what exactly you mean by
logical. Who is to say that what you imagine to be logical isn't
completely unfathomable to somebody else? All I was pointing out was
that computer programs are logical in the sense that they do what the
programmer specifically made them do.
But that's just playing pick-n-choose with definitions to fit some purpose
outside of the actual context.
When I say that a program ought to be logical, whom do you think I am
really framing my definition of "logical" by? The machine or the person
who wants to use the machine? For me (someone who once revelled in all
that machine-based geekery but has now turned his back on it and returned
his attention to human beings) the user of the computer -- of any tool,
when it comes to that -- is the element of prime importance and the one
for whom the whole interaction with that tool ought to take precedence.
OK fair enough, but there are still differences as to what is logical. There
are people out there who can't fathom unix at all, thinking that it's
completely illogical, whereas others swear by it. People's minds work in
different ways, so no matter how logical you think, say, your program's
behaviour is, somebody else might expect it to behave very differently.
Post by Robert de Vincy
Is this a Science vs. Arts thing?
No. What's the point of that?
Matthew Huntbach
2004-07-29 09:24:42 UTC
Permalink
Post by Robert de Vincy
Post by Ray Pang
Post by Robert de Vincy
How about the software that people are using? Is that logical by
virtue of the computer's underlying principles?
It behaves precisely how thje programming code dictates it to run on a
specific machine under the specific conditions. It's as logical as the
universe. Everything (should) make sense.
Should, but doesn't. Why else do we have "Windows sucks!" mentalities
being perpetuated? If Windows was such a perfectly logical system then
it would work. First time, every time.
You are using the word "logical" in a different sense from the way Ray is
using it. Ray is using it to mean "has a fixed pattern of behaviour which
can be analysed on a step by step basis". You are using it to mean "works in
a way that humans find easy to understand".

If Windows truly were "illogical" in Ray's sense, it would have a little
goblin sitting inside it which decides to do different things on a whim. It
doesn't. If it "doesn't work" according to the human specification it
doesn't mean a goblin has decided to act strangely. Rather it means the
programmer hasn't taken account of some unusual pattern of circumstances.
What has gone "wrong" in the sense of it not behaving as you would expect
can be analysed logically by looking at the code, and a patch in the shape
of modified code can be issued by Microsoft which ensures that if those same
circumstances are met again, it behaves according to the specification.
Post by Robert de Vincy
Post by Ray Pang
Post by Robert de Vincy
There's a whole layer of potentially non-logical operating-system-plus-
applications sitting there that has to be learnt for anyone wanting to
use a computer.
Logical from whose/what point of view? From a mathematician's point of
view, it's all logical.
Once again, this is presuming that because the underlying "mechanics" are
logical in operation then anything created on them will automatically be
logical. What about the human factor? I could go away now and write a
Delphi application that would be highly illogical and very difficult to
operate. How could the underlying computer's principles of logic stop me
from incorporating illogical features?
Code in Delphi does not rely on a temperamental goblin. Therefore, in Ray's
sense it's logical, even though in your sense it's not. It would be perfectly
possible, for example, to write a piece of code which meant your Delphi
application behaved differently on a Thursday afternoon to any other time.
That would be highly illogical in your sense, but logical in Ray's sense in
that one could look at the code and see it behaved precisely as programmed.
In the case of Windows, no-one would write code that was deliberately
"illogical" in your sense like this. However, consider a piece of code that
behaves differently if some store location is used up than if it is only
partially used. Illogical in your sense, logical in Ray's. The programmer
who wrote that code either assumed that store location would never become
completely full or forgot to put in extra code that would deal with the
store location becoming full and doing whatever work would be necessary to
mean that didn't affect what the user of the system saw. It's *because*
Windows is logical that it's possible to find this store location problem
and add extra code to deal with it.

Matthew Huntbach
Adam
2004-07-30 11:09:52 UTC
Permalink
Post by Robert de Vincy
Post by Adam
Post by Robert de Vincy
Post by unknown
Do more maths instead :-)
Yep. That's useful for the "real world".
Actually it is.
A familiarity with numbers is essential to modern life,
I would disagree with that. With the implication, anyway, that in our
"modern life" we need familiarity with numbers more and more. Isn't
the hue and cry about innumeracy all about how our "modern life" gives
us computers and calculators and machines that place an extra level
between the common chap-in-the-street and a need to work with actual
numbers?
In other words, we -- now -- have less need to know how to add up or
work out a square-root to a practical precision or whatever because
"modern life" provides devices to do that stuff.
Or if you're talking about a familiarity with numbers that goes beyond
practical arithmetic, then... seriously? Who needs maths to that sort
of level enough to call its acquisition "essential to modern life"?
Okay, maybe 'essential' was a bit strong, I am very slow at adding up, even
worse at subtracting, if someone asked me "seven plus six" i'd have to do
"six plus six plus one" in my head.

But, and this I suppose is my overall point about maths, it teaches you how
to approach problems, break them down into smaller things which you *can*
do, and the problem becomes easier.
Post by Robert de Vincy
Post by Adam
and a mind taught to think logically helps people learn how to use
computers, since computers are also logical.
Spoken like a true CS student!
How about the software that people are using? Is that logical by virtue
of the computer's underlying principles? There's a whole layer of
potentially non-logical operating-system-plus-applications sitting there
that has to be learnt for anyone wanting to use a computer.
Learning a whole load of mathematical proofs and formulas and whatnots
will not help you use a word-processor, will not show you how to create
a safe and recoverable back-up routine, and will certainly not tell you
if formatting an infected floppy will remove the virus.
I'm gonna latch on to the "safe and recoverable back-up routine" cos I think
that being good at maths and being good at devising such routines *are*
linked (tho, of course, there will be exceptions ... but what I'm saying is
it's easier to teach people to think via maths, than via studying classic
literature and devising arguments and writing essays, even tho both do the
same sort of thing).

My reason is that, when doing maths, you can't ever 'fudge' things; you can't
say "oh, well, it's nearly right". You have to make sure that everything you
have done, every step you take, is accurate.

With GCSE and A-level maths there are usually only a few steps and I reckon
most people can grasp that and I reckon it does help with general life.

Like, doing a backup thing, you need to make sure that all the data you need
is stored, stored somewhere separate to the original, and stored in such a
manner that it can be recovered.
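Those three requirements (all the data stored, stored somewhere separate to the original, and stored so it can be recovered) can be sketched as a short routine. This is purely an illustrative sketch added for this write-up, not anything from the thread; the function names and paths are invented for the example:

```python
import hashlib
import shutil
from pathlib import Path


def sha256_of(path: Path) -> str:
    # Hash the file contents so a copy can be verified byte-for-byte.
    return hashlib.sha256(path.read_bytes()).hexdigest()


def back_up(source: Path, backup_dir: Path) -> Path:
    """Copy `source` into `backup_dir`, then verify the copy is recoverable."""
    backup_dir.mkdir(parents=True, exist_ok=True)  # somewhere separate to the original
    destination = backup_dir / source.name
    shutil.copy2(source, destination)              # all the data is stored
    # ...and stored in such a manner that it can be recovered:
    if sha256_of(source) != sha256_of(destination):
        raise IOError(f"backup of {source} did not verify")
    return destination
```

Reading the copy back and comparing checksums is what turns "stored" into "recoverable": a backup you never verify is only assumed to work.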


Contrast this with the kind of thought processes which go on in humanities
subjects. Obviously, if you are doing it properly, it will be similar. I'm
no expert (I've written 2 essays this year and both were utter shite) but
when putting forward a position you have to make sure that it is logical,
one point follows on from the other, make sure there are no 'holes' in your
argument, and that any possible counter-arguments are dealt with.

But when it is taught in school, it is possible to do a great deal wrong and
still get an okay mark.

With maths, if you do a great deal wrong, it's wrong.

And it's easier to show where things have gone wrong.

Hence, it's easier to teach and learn from maths.


Obviously, by far the best way to learn anything is through direct
experience with what you are trying to learn. But, if you have an extra
slot in your timetable, then taking more maths will be more useful than
taking media studies, and if you have no preference (i.e. don't *like*
English more than maths, and aren't necessarily better at one or the other)
then maths will be more useful than English.

adam

(ps - i have an open mind on this, i've just written my current thinking but
i'd be really interested to hear your thoughts cos you seem like a clever
chap with the opposite view, and i like clever chaps with opposite views)
Robert de Vincy
2004-07-30 13:02:49 UTC
Adam did write:

[...]
Post by Adam
But, and this I suppose is my overall point about maths, it teaches you
how to approach problems, break them down into smaller things which you
*can* do, and the problem becomes easier.
It does, I agree. In theory. But take a look at Andy Walker's "we don't
interview for Notts maths now" anecdote about giving test questions to
potential maths students. If those students had truly absorbed all the
goodness of mathematical (i.e. logical) thinking then the "We haven't done
this in class" response should have been the exception.

I'm not arguing against your essential point here, but looking at the
evidence and results of those people who *HAVE* done "more maths". (Okay,
so it's anecdotal and hardly a controlled study, but it's a prominent enough
behaviour in the right sort of sample of people that it is worthy of being
mentioned.)
Post by Adam
Post by Robert de Vincy
Learning a whole load of mathematical proofs and formulas and whatnots
will not help you use a word-processor, will not show you how to create
a safe and recoverable back-up routine, and will certainly not tell you
if formatting an infected floppy will remove the virus.
I'm gonna latch on to the "safe and recoverable back-up routine" cos I
think that being good at maths and being good at devising such routines
*are* linked (tho, of course, there will be exceptions ... but what I'm
saying is it's easier to teach people to think via maths, than via
studying classic literature and devising arguments and writing essays,
even tho both do the same sort of thing).
Yes, I don't doubt that being taught maths to A-level (and higher) standard
means you should be learning how to think about problems in a methodical
way.

But there are two points about that:

1.
Is it really happening in the Real World? I refer you, again, to Mr Walker's
potential uni students. They, presumably, were very proficient at A-level-
standard mathematics and were clearly eager to continue with their exploration
of mathematics, but did the majority exhibit the sort of logical and
systematic thinking that maths *ought to* induce? From the evidence of the
anecdote, it seems not.
What happened? I can make a few suggestions:
a) the idea that maths teaches logical thinking is false; or
b) the students in the example were a particularly bad sample and do not
exemplify the typical "mathematically trained" student; or
c) for some people, this "training" does stay with them and make a difference
(and these are the ones that Mr Walker described as managing to solve the
problems), but for the majority, this "training" is lost on them.
There are probably other explanations, but these are the ones that I think
are most likely, given no further details or data.

So, the point of this point is: the techniques taught in mathematics should
lead to logical and methodical thinking, but can we see any convincing
evidence of this, except as minority exceptions?

2.
People don't always behave in mathematically logical (i.e. the definition of
"logical" that Ray Pang used elsewhere in this thread) ways. People behave
in the way that people behave, not in mechanically predictable behaviour
patterns. I've hinted at this... frustration?... before on AUA, and I'll
explicitly say it now. Studying computers and maths and logical problem-
solving is fine and dandy for some areas, but it does not equip you with
the right approach to dealing with people. You need something else...
empathy, understanding, call-it-what-you-will. No amount of mathematical
logic will enable you to understand how a person truly interacts with a
computer. There has to be that extra "Well, this is what humans think
and want and feel about such-and-such" so that, for example, a workable
back-up routine will, um, work.
Sure, the underlying process might be produced through methodical application
of logical thinking, but that's not the end of it. It's missing the
vital component, the essential component, the component whose priorities and
needs matter more than anything else: the person using this tool that you
have created. Mathematically logical thinking does not teach you how to cope
with that part.

Oh, and a third (minor) point:
3.
Schools should teach the art of rhetoric. It's amazing how many weak and
feeble arguments appear on Usenet. And -- even more annoyingly -- how many
times people will reply to an argument mid-paragraph (or mid-sentence, even!)
before the actual point has been made, replying directly to the rhetorical
device rather than the real point that is being set-up to be made. I guess
that's the danger inherent in this medium with the ability to "interrupt" a
person's long essay at any point you choose, but I do think the inability to
think about what the person is saying as a whole is missing from so many
people.
Maths might teach logical reasoning, but a little bit of rhetorical training
will make one's arguing technique much more effective.
Post by Adam
My reason is, when doing maths, you can't 'fudge' things ever, you can't
say "oh, well, it's nearly right", you have to make sure that everything
you have done, every step you take, is accurate.
With GCSE and A-level maths there are usually only a few steps and I
reckon most people can grasp that and I reckon it does help with general
life.
Like, doing a backup thing, you need to make sure that all the data you
need is stored, stored somewhere separate to the original, and stored in
such a manner that it can be recovered.
Contrast this with the kind of thought processes which go on in humanities
subjects. Obviously, if you are doing it properly, it will be similar.
I'm no expert (I've written 2 essays this year and both were utter shite)
but when putting forward a position you have to make sure that it is
logical, one point follows on from the other, make sure there are no
'holes' in your argument, and that any possible counter-arguments are
dealt with.
But when it is taught in school, it is possible to do a great deal wrong
and still get an okay mark.
With maths, if you do a great deal wrong, it's wrong.
And it's easier to show where things have gone wrong.
Hence, it's easier to teach and learn from maths.
Hmm... I wrote a whole chunk of text here, based on what I thought that
last line meant, but then I wondered if you really did mean what I think
you mean. Um. Anyway... what do you mean exactly by that last line?

Do you mean:
a) It is easier to teach from maths and it is easier to learn from maths.
Or:
b) It is easier to teach maths and it is easier to learn from maths.

Regardless of the details of what you did mean, both of my interpretations
lead me to see a "So what use is maths outside of itself?" situation if I
take your 'humanities versus maths' contrasts into consideration. In other
words, you say that maths is one thing, humanities subjects another thing,
and never the two shall meet. If that's the case, then what influence could
maths have on your ability to learn other stuff? (This is getting close to
my post the other week about a theory explaining only itself... "learning
maths will teach you how to learn only maths"!) If that's not what you meant,
then ignore me. More so.
Post by Adam
Obviously, by far the best way to learn anything is through direct
experience with what you are trying to learn. But, if you have an extra
slot in your timetable, then taking more maths will be more useful than
taking media studies, and if you have no preference (i.e. don't *like*
English more than maths, and aren't necessarily better at one or the
other) then maths will be more useful than English.
adam
(ps - i have an open mind on this, i've just written my current thinking
but i'd be really interested to hear your thoughts cos you seem like a
clever chap with the opposite view, and i like clever chaps with opposite
views)
Ah, flattery!

I don't think I have a truly opposite view that I will fight for till the
last breath leaves my body. I'm just considering all the arguments and
positions that could stand on the opposite side. As I hinted at in a reply
elsewhere, AUA does seem to have rather a mathcentric bias. (For some rough-
and-ready proof of this, take a little while to go back through the Google
archives and look at the "IS NE1 DOING QQQQ TOMOROW?!?!?!?" or "DID NE1 DO
QQQQ YESTRDAY?!?!?!?" posts. Work out the percentage of those in which QQQQ
is a maths paper and of those in which QQQQ is not a maths paper.)
There is a whole world of A-level topics out there, and it puzzles me why,
time and again, maths seems to appear in this group and other subjects don't.
--
BdeV
Dr A. N. Walker
2004-07-30 13:55:42 UTC
Post by Robert de Vincy
It does, I agree. In theory. But take a look at Andy Walker's "we don't
interview for Notts
"Notts" is County. We are "Nott'm", as in Forest.
Post by Robert de Vincy
maths now" anecdote about giving test questions to
potential maths students. If those students had truly absorbed all the
goodness of mathematical (i.e. logical) thinking then the "We haven't done
this in class" response should have been the exception.
In case it wasn't clear, that *was* the exception, at least in
the form "We haven't done this and therefore I'm not going to try to
do it now even though I know this is blowing my chances of going to
university"; perhaps < 5%. Many more "We haven't done this, so I have
no idea what to do, could you drop a hint?" responses, perhaps 20%.

[...]
Post by Robert de Vincy
There is a whole world of A-level topics out there, and it puzzles me why,
time and again, maths seems to appear in this group and other subjects don't.
One reason is that asking "I can't remember whether arsenic is
a salt or a metal and I have an exam tomorrow, please help" or "I don't
understand the origins of WW1 ..." etc just seem silly, whereas "Help!
I'm stuck with this integral ..." seems the sort of thing that people
here can just *do*. Another related reason is that maths is the sort
of subject where people feel they need help beyond what is in their
books, whereas in many other subjects the exam questions are solved
word-for-word in the texts. A third is that most subjects are taken
only by students who have chosen to do so and who therefore feel "at
ease" with those subjects, whereas maths is commonly taken by science
and engineering students to whom it is a necessary but very black art.
--
Andy Walker, School of MathSci., Univ. of Nott'm, UK.
***@maths.nott.ac.uk
Robert de Vincy
2004-07-30 16:22:47 UTC
Post by Dr A. N. Walker
Post by Robert de Vincy
It does, I agree. In theory. But take a look at Andy Walker's "we
don't interview for Notts
"Notts" is County. We are "Nott'm", as in Forest.
Notted.

Um, I mean noted.
Post by Dr A. N. Walker
Post by Robert de Vincy
maths now" anecdote about giving test questions to
potential maths students. If those students had truly absorbed all
the goodness of mathematical (i.e. logical) thinking then the "We
haven't done this in class" response should have been the exception.
In case it wasn't clear, that *was* the exception, at least in
the form "We haven't done this and therefore I'm not going to try to
do it now even though I know this is blowing my chances of going to
university"; perhaps < 5%. Many more "We haven't done this, so I
have no idea what to do, could you drop a hint?" responses, perhaps
20%.
Ah. Then my point's strength is somewhat reduced. That's what happens
when you write a reply off-line and rely on memory.
Post by Dr A. N. Walker
[...]
Post by Robert de Vincy
There is a whole world of A-level topics out there, and it puzzles me
why, time and again, maths seems to appear in this group and other
subjects don't.
One reason is that asking "I can't remember whether arsenic is
a salt or a metal and I have an exam tomorrow, please help" or "I
don't understand the origins of WW1 ..." etc just seem silly, whereas
"Help! I'm stuck with this integral ..." seems the sort of thing that
people here can just *do*.
Why would questions like that seem silly to you? I can understand that
the maths question you give here would find a good home in the atmosphere
of AUA, but why are you placing that sort of question in opposition against
the others? I'm not challenging you in a "Oi! They're serious questions,
too, you lout!" sort of way; I'm genuinely interested in what feature(s)
you used to distinguish the two different examples.

My take on the two examples you gave is that, yes, the first one is fairly
silly since all it takes is a few moments to look in a chemistry textbook
for the answer. But the second one doesn't have one definitive answer. It
almost begs to be debated and discussed in a debating and discussing forum!
Post by Dr A. N. Walker
Another related reason is that maths is the sort of subject where people
feel they need help beyond what is in their books, whereas in many other
subjects the exam questions are solved word-for-word in the texts.
Some subjects, yes. I think something like chemistry, maybe.

But what about all the arts and humanities subjects? Those that require
an entire essay per question in the exams? And those that have many
theories/ideas/opinions and require a summing up that discusses each one's
strong and weak points? E.g. in a fictional RE query, "I've read what
Wilberson says about Spalling's analysis of papal bathing customs, but
how does it conflict with what Meijer says in his DE PONTIFICIBVS ET
BALNEIS?"
Post by Dr A. N. Walker
A third is that most subjects are taken only by students who have chosen to
do so and who therefore feel "at ease" with those subjects, whereas maths
is commonly taken by science and engineering students to whom it is a
necessary but very black art.
That's the best reason out of your list, I feel.
--
BdeV
Ray Pang
2004-07-30 17:14:22 UTC
Post by Robert de Vincy
Post by Dr A. N. Walker
Post by Robert de Vincy
It does, I agree. In theory. But take a look at Andy Walker's "we
don't interview for Notts
"Notts" is County. We are "Nott'm", as in Forest.
Notted.
Um, I mean noted.
Post by Dr A. N. Walker
Post by Robert de Vincy
maths now" anecdote about giving test questions to
potential maths students. If those students had truly absorbed all
the goodness of mathematical (i.e. logical) thinking then the "We
haven't done this in class" response should have been the exception.
In case it wasn't clear, that *was* the exception, at least in
the form "We haven't done this and therefore I'm not going to try to
do it now even though I know this is blowing my chances of going to
university"; perhaps < 5%. Many more "We haven't done this, so I
have no idea what to do, could you drop a hint?" responses, perhaps
20%.
Ah. Then my point's strength is somewhat reduced. That's what happens
when you write a reply off-line and rely on memory.
Post by Dr A. N. Walker
[...]
Post by Robert de Vincy
There is a whole world of A-level topics out there, and it puzzles me
why, time and again, maths seems to appear in this group and other
subjects don't.
One reason is that asking "I can't remember whether arsenic is
a salt or a metal and I have an exam tomorrow, please help" or "I
don't understand the origins of WW1 ..." etc just seem silly, whereas
"Help! I'm stuck with this integral ..." seems the sort of thing that
people here can just *do*.
Why would questions like that seem silly to you? I can understand that
the maths question you give here would find a good home in the atmosphere
of AUA, but why are you placing that sort of question in opposition against
the others? I'm not challenging you in a "Oi! They're serious questions,
too, you lout!" sort of way; I'm genuinely interested in what feature(s)
you used to distinguish the two different examples.
My take on the two examples you gave is that, yes, the first one is fairly
silly since all it takes is a few moments to look in a chemistry textbook
for the answer. But the second one doesn't have one definitive answer. It
almost begs to be debated and discussed in a debating and discussing forum!
The problem is that the responses to that sort of question would be long and
involved, and by the time some poor soul eventually gives up and asks on AUA,
(s)he won't have time to wait around. A-level integrals can be done in
seconds by quite a few people on here though.

Tip: If you have an essay to write, put it on AUA straight away and then
just use everybody's opinions to write the essay when it's due to be handed
in. Simple.
Dr A. N. Walker
2004-07-30 17:48:39 UTC
Post by Robert de Vincy
My take on the two examples you gave are that, yes, the first one is fairly
silly since all it takes is a few moments to look in a chemistry textbook
for the answer. But the second one doesn't have one definitive answer. It
almost begs to be debated and discussed in a debating and discussing forum!
But people don't get *stuck* on it. Faced with an essay,
you can google for key words, look in your books, etc., and make
an attempt. If you get stuck with a maths problem -- say, an
integral -- then there is no such help unless the *exact* problem
is solved in your text. What's more, you are often stuck with a
blank piece of paper, and no way to lift a corner until someone
has dropped a hint.
--
Andy Walker, School of MathSci., Univ. of Nott'm, UK.
***@maths.nott.ac.uk
Adam
2004-07-30 14:04:23 UTC
Post by Robert de Vincy
[...]
Post by Adam
But, and this I suppose is my overall point about maths, it teaches you
how to approach problems, break them down into smaller things which you
*can* do, and the problem becomes easier.
It does, I agree. In theory. But take a look at Andy Walker's "we don't
interview for Notts maths now" anecdote about giving test questions to
potential maths students. If those students had truly absorbed all the
goodness of mathematical (i.e. logical) thinking then the "We haven't done
this in class" response should have been the exception.
I'm not arguing against your essential point here, but looking at the
evidence and results of those people who *HAVE* done "more maths". (Okay,
so it's anecdotal and hardly a controlled study, but it's a prominent enough
behaviour in the right sort of sample of people that it is worthy of being
mentioned.)
I believe (with no actual reason whatsoever) that they answered that more
because they didn't know fully what they were capable of.

If the next line in the interview had been "well try and work it out, let me
see your thought process" then I think that is where strengths in maths
would really be shown.

I think people are too worried about getting it wrong, so they prefer to
start off with an excuse for not even trying.
Post by Robert de Vincy
Post by Adam
Post by Robert de Vincy
Learning a whole load of mathematical proofs and formulas and whatnots
will not help you use a word-processor, will not show you how to create
a safe and recoverable back-up routine, and will certainly not tell you
if formatting an infected floppy will remove the virus.
I'm gonna latch on to the "safe and recoverable back-up routine" cos I
think that being good at maths and being good at devising such routines
*are* linked (tho, of course, there will be exceptions ... but what I'm
saying is it's easier to teach people to think via maths, than via
studying classic literature and devising arguments and writing essays,
even tho both do the same sort of thing).
Yes, I don't doubt that being taught maths to A-level (and higher) standard
means you should be learning how to think about problems in a methodical
way.
1.
Is it really happening in the Real World? I refer you, again, to Mr Walker's
potential uni students. They, presumably, were very proficient at A-level-
standard mathematics and were clearly eager to continue with their exploration
of mathematics, but did the majority exhibit the sort of logical and
systematic thinking that maths *ought to* induce? From the evidence of the
anecdote, it seems not.
a) the idea that maths teaches logical thinking is false; or
b) the students in the example were a particularly bad sample and do not
exemplify the typical "mathematically trained" student; or
c) for some people, this "training" does stay with them and make a difference
(and these are the ones that Mr Walker described as managing to solve the
problems), but for the majority, this "training" is lost on them.
There are probably other explanations, but these are the ones that I think
are most likely, given no further details or data.
So, the point of this point is: the techniques taught in mathematics should
lead to logical and methodical thinking, but can we see any convincing
evidence of this, except as minority exceptions?
I think evidence of logical and methodical thinking is pretty hard to find,
even if it was commonplace.
Post by Robert de Vincy
2.
People don't always behave in mathematically logical (i.e. the definition of
"logical" that Ray Pang used elsewhere in this thread) ways. People behave
in the way that people behave, not in mechanically predictable behaviour
patterns. I've hinted at this... frustration?... before on AUA, and I'll
explicitly say it now. Studying computers and maths and logical problem-
solving is fine and dandy for some areas, but it does not equip you with
the right approach to dealing with people. You need something else...
empathy, understanding, call-it-what-you-will. No amount of mathematical
logic will enable you to understand how a person truly interacts with a
computer. There has to be that extra "Well, this is what humans think
and want and feel about such-and-such" so that, for example, a workable
back-up routine will, um, work.
Sure, the underlying process might be produced through methodical application
of logical thinking, but that's not the end of it. It's missing the
vital component, the essential component, the component whose priorities and
needs matter more than anything else: the person using this tool that you
have created. Mathematically logical thinking does not teach you how to cope
with that part.
Right, the best way (IMO) to try to understand people is to look at what
they have done in the past. Like, if I tell a girl she is ugly, I get a
slap. No amount of maths could have predicted that, but I have now learnt
that is the outcome and I know not to do it again.

I think that maths helps to teach people to learn. Even when dealing with
people, you need to be able to spot patterns in behaviour, extrapolate from
past events etc. These things are learnt through experience, but I think
mathematical training will make it easier to learn these things. Perhaps
not obviously, and there would be no clear link (people won't start thinking
"Mr X did Y last week under these circumstances, so he will do Z tomorrow")
but, I guess it's just a gut feeling I have.
Post by Robert de Vincy
3.
Schools should teach the art of rhetoric. It's amazing how many weak and
feeble arguments appear on Usenet. And -- even more annoyingly -- how many
times people will reply to an argument mid-paragraph (or mid-sentence, even!)
before the actual point has been made, replying directly to the rhetorical
device rather than the real point that is being set-up to be made. I guess
that's the danger inherent in this medium with the ability to "interrupt" a
person's long essay at any point you choose, but I do think the inability to
think about what the person is saying as a whole is missing from so many
people.
Maths might teach logical reasoning, but a little bit of rhetorical training
will make one's arguing technique much more effective.
Agreed.

Tho they shouldn't be taught it until they are old enough to use it properly
;-)

Which brings me to another point, people think schools should teach whatever
it is they are doing. So, a philosopher would want kids to be taught lots
and lots of philosophy. A mathematician would want schools to teach lots and
lots of maths. A linguist would want kids to learn lots of languages. Each
person would have a strong argument as to why they are correct, and why
their subject is the most important.

So how can we possibly know what is right?
Post by Robert de Vincy
Post by Adam
My reason is, when doing maths, you can't 'fudge' things ever, you can't
say "oh, well, it's nearly right", you have to make sure that everything
you have done, every step you take, is accurate.
With GCSE and A-level maths there are usually only a few steps and I
reckon most people can grasp that and I reckon it does help with general
life.
Like, doing a backup thing, you need to make sure that all the data you
need is stored, stored somewhere separate to the original, and stored in
such a manner that it can be recovered.
Contrast this with the kind of thought processes which go on in humanities
subjects. Obviously, if you are doing it properly, it will be similar.
I'm no expert (I've written 2 essays this year and both were utter shite)
but when putting forward a position you have to make sure that it is
logical, one point follows on from the other, make sure there are no
'holes' in your argument, and that any possible counter-arguments are
dealt with.
But when it is taught in school, it is possible to do a great deal wrong
and still get an okay mark.
With maths, if you do a great deal wrong, it's wrong.
And it's easier to show where things have gone wrong.
Hence, it's easier to teach and learn from maths.
Hmm... I wrote a whole chunk of text here, based on what I thought that
last line meant, but then I wondered if you really did mean what I think
you mean. Um. Anyway... what do you mean exactly by that last line?
a) It is easier to teach from maths and it is easier to learn from maths.
b) It is easier to teach maths and it is easier to learn from maths.
Regardless of the details of what you did mean, both of my interpretations
lead me to see a "So what use is maths outside of itself?" situation if I
take your 'humanities versus maths' contrasts into consideration. In other
words, you say that maths is one thing, humanities subjects another thing,
and never the two shall meet. If that's the case, then what influence could
maths have on your ability to learn other stuff? (This is getting close to
my post the other week about a theory explaining only itself... "learning
maths will teach you how to learn only maths"!) If that's not what you meant,
then ignore me. More so.
I think I meant b - it is easier to teach maths and it is easier to learn
from maths.

But I'm probably only saying that because I got an A* in maths GCSE and a C
in English.

In my reasoning above, I was treating 'humanities' as mainly what you called
'rhetoric'. I don't really remember much of GCSE and I didn't do any
humanities A-levels, but I did politics last year and we were told in our
essays to write the strongest arguments for our case that we could, and that
was what would give us the marks.

And I could see the link between the work I do in maths of stating a
proposition, then working through a proof, covering every angle of attack
which might make the proof invalid, and finishing with something which shows
beyond any possible doubt that the proposition is correct.
Post by Robert de Vincy
Post by Adam
Obviously, by far the best way to learn anything is through direct
experience with what you are trying to learn. But, if you have an extra
slot in your timetable, then taking more maths will be more useful than
taking media studies, and if you have no preference (i.e. don't *like*
English more than maths, and aren't necessarily better at one or the
other) then maths will be more useful than English.
adam
(ps - i have an open mind on this, i've just written my current thinking
but i'd be really interested to hear your thoughts cos you seem like a
clever chap with the opposite view, and i like clever chaps with opposite
views)
Ah, flattery!
I don't think I have a truly opposite view that I will fight for till the
last breath leaves my body. I'm just considering all the arguments and
positions that could stand on the opposite side. As I hinted at in a reply
elsewhere, AUA does seem to have rather a mathcentric bias. (For some rough-
and-ready proof of this, take a little while to go back through the Google
archives and look at the "IS NE1 DOING QQQQ TOMOROW?!?!?!?" or "DID NE1 DO
QQQQ YESTRDAY?!?!?!?" posts. Work out the percentage of those in which QQQQ
is a maths paper and of those in which QQQQ is not a maths paper.)
There is a whole world of A-level topics out there, and it puzzles me why,
time and again, maths seems to appear in this group and other subjects don't.
Cos maths people are geeks and arty people don't frequent 'newsgroups'.

Pretty much every 'arty' person in here has some (unhealthy?) interest in
computers.

adam
Robert de Vincy
2004-07-30 17:09:15 UTC
Adam did write:

[snips to stop this being 1000 lines long]
Post by Adam
I think evidence of logical and methodical thinking is pretty hard to
find, even if it was commonplace.
But there *MUST* be effects that can be observed if you base your whole
argument on "Do more maths and you will do better at other things", yes?

If there is no measurable or even visible-yet-not-in-any-way-quantifiable
advantage to doing "more maths" then what is the effect you are hoping to
achieve? "Do this thing here, and you will gain an advantage that can't
be seen or measured!" Umm, I don't think that's really what you're
trying to say, is it?
Post by Adam
Right, the best way (IMO) to try to understand people is to look at
what they have done in the past. Like, if I tell a girl she is ugly,
I get a slap. No amount of maths could have predicted that, but I
have now learnt that is the outcome and I know not to do it again.
I think that maths helps to teach people to learn. Even when dealing
with people, you need to be able to spot patterns in behaviour,
extrapolate from past events etc. These things are learnt through
experience, but I think mathematical training will make it easier to
learn these things. Perhaps not obviously, and there would be no
clear link (people won't start thinking "Mr X did Y last week under
these circumstances, so he will do Z tomorrow") but, I guess it's just
a gut feeling I have.
Hmm, this is sounding suspiciously like a "It's not the details that
matter, but merely the academic rigour that is needed to do the subject"
position.

Yes?
Post by Adam
Post by Robert de Vincy
3.
Schools should teach the art of rhetoric. It's amazing how many weak
and feeble arguments appear on Usenet. And -- even more annoyingly
-- how many times people will reply to an argument mid-paragraph (or
mid-sentence, even!) before the actual point has been made, replying
directly to the rhetorical device rather than the real point that is
being set-up to be made. I guess that's the danger inherent in this
medium with the ability to "interrupt" a person's long essay at any point
you choose, but I do think the inability to think about what the person
is saying as a whole is missing from so many people.
Maths might teach logical reasoning, but a little bit of rhetorical
training will make one's arguing technique much more effective.
Agreed.
Tho they shouldn't be taught it until they are old enough to use it
properly ;-)
heh.
Scared of being verbally beaten up by a gang of 14-year-olds?
Post by Adam
Which brings me to another point, people think schools should teach
whatever it is they are doing. So, a philosopher would want kids to
be taught lots and lots of philosophy. A mathematician would want
schools to teach lots and lots of maths. A linguist would want kids
to learn lots of languages.
Learning ABOUT lots of languages, yes. More importantly, unlearning
the awful social conditioning that people attach to language varieties.
We've managed to introduce legislation against discrimination based on
sex, religion, ethnic origin, disability, but we still -- all of us,
unconsciously -- automatically judge and pigeon-hole someone the moment
he/she says something, based not on the words but the way the words are
spoken, and nothing 'official' even vaguely condemns this.
Post by Adam
Each person would have a strong argument as to why they are correct, and
why their subject is the most important.
So how can we possibly know what is right?
Easy: ask a linguist. They never lie, honest!
Post by Adam
And I could see the link between the work I do in maths of stating a
proposition, then working through a proof, covering every angle of
attack which might make the proof invalid, and finishing with
something which shows beyond any possible doubt that the proposition
is correct.
But if you've learnt a "science" properly (any science), then surely
you should have learnt the basic Scientific Method of "observe,
hypothesize, predict, experiment/test". Rigorous analysis with tons
of supporting proof is not just the sole property of mathematics.
Post by Adam
Post by Robert de Vincy
There is a whole world of A-level topics out there, and it puzzles me
why, time and again, maths seems to appear in this group and other
subjects don't.
Cos maths people are geeks and arty people don't frequent 'newsgroups'.
Pretty much every 'arty' person in here has some (unhealthy?) interest
in computers.
True.
--
BdeV
Ray Pang
2004-07-30 17:18:30 UTC
Permalink
Post by Robert de Vincy
[snips to stop this being 1000 lines long]
Post by Adam
I think evidence of logical and methodical thinking is pretty hard to
find, even if it was commonplace.
But there *MUST* be effects that can be observed if you base your whole
argument on "Do more maths and you will do better at other things", yes?
If there is no measurable or even visible-yet-not-in-any-way-quantifiable
advantage to doing "more maths" then what is the effect you are hoping to
achieve? "Do this thing here, and you will gain an advantage that can't
be seen or measured!" Umm, I don't think that's really what you're
trying to say, is it?
OK. Maths graduates are the second highest paid after Law graduates. I think
I read that somewhere.
Dr A. N. Walker
2004-07-30 17:51:57 UTC
Permalink
Post by Ray Pang
OK. Maths graduates are the second highest paid after Law graduates. I think
I read that somewhere.
Actually they're the highest paid *including* law; I know
I read that somewhere; and now you have too. Mind you, I think the
survey was done at age 24 or some such, and it might be different by
age 30, 40, 73 or whatever. And it excluded maths lecturers.
--
Andy Walker, School of MathSci., Univ. of Nott'm, UK.
***@maths.nott.ac.uk
Ray Pang
2004-07-30 18:23:21 UTC
Permalink
Post by Dr A. N. Walker
Post by Ray Pang
OK. Maths graduates are the second highest paid after Law graduates. I think
I read that somewhere.
Actually they're the highest paid *including* law;
Cool!
Post by Dr A. N. Walker
And it excluded maths lecturers.
He he, my heart bleeds!
Adam
2004-07-30 20:32:45 UTC
Permalink
Post by Robert de Vincy
[snips to stop this being 1000 lines long]
Post by Adam
I think evidence of logical and methodical thinking is pretty hard to
find, even if it was commonplace.
But there *MUST* be effects that can be observed if you base your whole
argument on "Do more maths and you will do better at other things", yes?
If there is no measurable or even visible-yet-not-in-any-way-quantifiable
advantage to doing "more maths" then what is the effect you are hoping to
achieve? "Do this thing here, and you will gain an advantage that can't
be seen or measured!" Umm, I don't think that's really what you're
trying to say, is it?
I said it's hard to find, it's elusive, difficult to measure. Doesn't mean
it doesn't exist.

The average earnings is a good one, and I have bagfuls of anecdotal evidence
that maths people are the cleverest people I know, not just at maths, but in
general, they are able to grasp new ideas better and quicker, and draw
conclusions from them faster.
Post by Robert de Vincy
Post by Adam
Right, the best way (IMO) to try to understand people is to look at
what they have done in the past. Like, if I tell a girl she is ugly,
I get a slap. No amount of maths could have predicted that, but I
have now learnt that is the outcome and I know not to do it again.
I think that maths helps to teach people to learn. Even when dealing
with people, you need to be able to spot patterns in behaviour,
extrapolate from past events etc. These things are learnt through
experience, but I think mathematical training will make it easier to
learn these things. Perhaps not obviously, and there would be no
clear link (people won't start thinking "Mr X did Y last week under
these circumstances, so he will do Z tomorrow") but, I guess it's just
a gut feeling I have.
Hmm, this is sounding suspiciously like a "It's not the details that
matter, but merely the academic rigour that is needed to do the subject"
position.
Yes?
Right.... It's like any kind of mental activity.

If I were to spend my summer as I have done already, drinking copious
amounts of alcohol (I make a damn good cosmopolitan now) and having
incredibly inactive days, then my brain will wither and die and I'll find
work very hard next term.

However, if I was to keep my brain active, read lots, practise maths, learn
new stuff which might not necessarily have anything to do with my degree,
then I'm gonna do better.
Post by Robert de Vincy
Post by Adam
Post by Robert de Vincy
3.
Schools should teach the art of rhetoric. It's amazing how many weak
and feeble arguments appear on Usenet. And -- even more annoyingly
-- how many times people will reply to an argument mid-paragraph (or
mid-sentence, even!) before the actual point has been made, replying
directly to the rhetorical device rather than the real point that is
being set-up to be made. I guess that's the danger inherent in this
medium with the ability to "interrupt" a person's long essay at any point
you choose, but I do think the inability to think about what the person
is saying as a whole is missing from so many people.
Maths might teach logical reasoning, but a little bit of rhetorical
training will make one's arguing technique much more effective.
Agreed.
Tho they shouldn't be taught it until they are old enough to use it
properly ;-)
heh.
Scared of being verbally beaten up by a gang of 14-year-olds?
Yes.

And also it was a kinda obscure Plato reference, together with the next
paragraph ... was wondering if anyone was gonna notice it.
Post by Robert de Vincy
Post by Adam
Which brings me to another point, people think schools should teach
whatever it is they are doing. So, a philosopher would want kids to
be taught lots and lots of philosophy. A mathematician would want
schools to teach lots and lots of maths. A linguist would want kids
to learn lots of languages.
Learning ABOUT lots of languages, yes. More importantly, unlearning
the awful social conditioning that people attach to language varieties.
We've managed to introduce legislation against discrimination based on
sex, religion, ethnic origin, disability, but we still -- all of us,
unconsciously -- automatically judge and pigeon-hole someone the moment
he/she says something, based not on the words but the way the words are
spoken, and nothing 'official' even vaguely condemns this.
Post by Adam
Each person would have a strong argument as to why they are correct, and
why their subject is the most important.
So how can we possibly know what is right?
Easy: ask a linguist. They never lie, honest!
Post by Adam
And I could see the link between the work I do in maths of stating a
proposition, then working through a proof, covering every angle of
attack which might make the proof invalid, and finishing with
something which shows beyond any possible doubt that the proposition
is correct.
But if you've learnt a "science" properly (any science), then surely
you should have learnt the basic Scientific Method of "observe,
hypothesize, predict, experiment/test". Rigorous analysis with tons
of supporting proof is not just the sole property of mathematics.
Ah, but all science is, is applied maths :)

Maths is the pure form of everything.

But I suppose my point is that in maths, it is most obvious and all stated
most formally.

In science people can get away with post hoc ergo propter hoc, and it is
often used in extremist political arguments, but in maths it simply isn't
possible to use that in a proof. You wouldn't get marks for "observation"
or "spelling and grammar".

adam
Robert de Vincy
2004-07-30 21:41:21 UTC
Permalink
Post by Adam
Post by Robert de Vincy
[snips to stop this being 1000 lines long]
Post by Adam
I think evidence of logical and methodical thinking is pretty hard
to find, even if it was commonplace.
But there *MUST* be effects that can be observed if you base your
whole argument on "Do more maths and you will do better at other
things", yes?
If there is no measurable or even
visible-yet-not-in-any-way-quantifiable advantage to doing "more
maths" then what is the effect you are hoping to achieve? "Do this
thing here, and you will gain an advantage that can't be seen or
measured!" Umm, I don't think that's really what you're trying to
say, is it?
I said it's hard to find, it's elusive, difficult to measure. Doesn't
mean it doesn't exist.
So, your advice now is to struggle with something that is (according to
other posts that have appeared in this thread) a mysterious and arcane
subject that many (note: many) people just don't "get" (and, statistically
speaking, you WILL be one of those, oh yes <insert smiley giving a knowing
wink, alluding to some other recent discussion -- it's true, a picture can
be worth a thousand words! Or, in this case, 12>) to receive some very
minuscule and/or difficult to perceive benefit? I hope you don't ever
become a salesman...
Post by Adam
The average earnings is a good one,
I'm unimpressed by earnings. According to that reasoning, professional
football players are something we should be encouraging all our children
to be.
Post by Adam
and I have bagfuls of anecdotal evidence that maths people are the
cleverest people I know, not just at maths, but in general, they are
able to grasp new ideas better and quicker, and draw conclusions from
them faster.
You might have bagfuls of evidence that you know some clever people who
also do maths, but is that evidence that maths caused the cleverness?
If you do have that evidence, then PLEASE POST IT NOW! I would be
convinced beyond all measure.
Post by Adam
Post by Robert de Vincy
Hmm, this is sounding suspiciously like a "It's not the details that
matter, but merely the academic rigour that is needed to do the
subject" position.
Yes?
Right.... It's like any kind of mental activity.
Not just maths? Whew! That means my current reading and summer-studying
schedule really will help me when I return to uni in October after all.
Post by Adam
If I were to spend my summer as I have done already, drinking copious
amounts of alcohol (I make a damn good cosmopolitan now) and having
incredibly inactive days, then my brain will wither and die and I'll
find work very hard next term.
However, if I was to keep my brain active, read lots, practise maths,
learn new stuff which might not necessarily have anything to do with
my degree, then I'm gonna do better.
Yeah. Of course.
Post by Adam
And also it was a kinda obscure Plato reference, together with the
next paragraph ... was wondering if anyone was gonna notice it.
I didn't. Deduct 5 points from my score.
Post by Adam
Ah, but all science is, is applied maths :)
I am SO glad you put that smiley there, or I'd be incensed so much that
I wouldn't be able to think coherently, reply to your message, or even
stand up for at least two days.
Post by Adam
Maths is the pure form of everything.
I disagree. Strongly. For several reasons.

1.
"everything"? Show me how mathematics can describe the effect that a
summer evening's sunset has on my state of mind and how it can conjure up
related memories of good times. Show me how mathematics can describe the
way I feel about pizza. Show me how mathematics can describe the joy of
being in love. Show me how mathematics can describe a lion accurately to
someone who has never seen one and make him appreciate the power and the
ferocity, and the beauty.

2.
To say that something is "the pure form of everything" is getting a little
too close to Essentialism, and that's a concept I don't... can't... subscribe
to.

3.
Mathematics is not the substratum upon which reality is built. It exists
in parallel, allowing us (or someone else who's not me, more likely!) to
model some phenomenon in formal terms. It is NOT the underlying building
blocks that "everything" is built up from. Yeah, you can reduce a lot of
things to a mathematical model or formula or whatever, but when you look at
that result you're not looking at the actual thing broken down, but rather
a simulation of the thing (broken down).

See, this is part of the "exalting maths too far" thing that I mentioned
in an earlier message. To me, it's conflating the actual event with the
newspaper report you read the next day. It might be a very, very (VERY)
accurate report (with colour pictures and everything!) with every minute
particle described to within an infinite degree of detail, but it's still
not the event itself. You might be able to use that report to inform
someone else and pass on the details of what happened, answering each and
every question with supreme accuracy, but you're not describing the actual
event, just a very precise analogue of it.

Before I'm attacked by the Maths Mob, I'm not relegating maths to the
equivalent of some 2nd-rate photocopying machine. What I am saying is
that if you photocopy the Magna Carta on *THE* highest quality photocopier
so that even an expert couldn't tell one from the other just by looking
at them, which would you say is the REAL Magna Carta? Have you somehow
"captured" the essence of the Magna Carta and imbued some blank sheet of
paper with it so that you now have two actual Magna Cartas? (Or "Magnae
Cartae"?) No. You have one genuine article, and an abso-fucking-lutely
accurate copy that is still not the actual thing.
Post by Adam
But I suppose my point is that in maths, it is most obvious and all
stated most formally.
In science people can get away with post hoc ergo propter hoc,
They can?!?
Post by Adam
and it is often used in extremist political arguments,
Valid ones, though? Ones that actually stand up to immense scrutiny?
Post by Adam
but in maths it simply isn't possible to use that in a proof. You wouldn't
get marks for "observation" or "spelling and grammar".
"marks"? Are you talking exams here? Sure, you don't get marks in an
exam for observation. There just isn't the time! You can't take a few
hours out and explore something until you see a pattern or some relevant
data emerging. Exams are horribly artificial in that you're force-fed
selected data and that's usually all you have to work with. Ugh.

As for "marks for" "spelling and grammar", I'm not sure what you mean
by that, exactly. If you write the most penetrating analysis of, say,
1980s Communist Russia and how the foreign policy of that decade led to
the country's break-up, does it really affect your argument if you
misspell "Andropov"?
--
BdeV
Ray Pang
2004-07-30 22:18:25 UTC
Permalink
Post by Robert de Vincy
Post by Adam
The average earnings is a good one,
I'm unimpressed by earnings. According to that reasoning, professional
football players are something we should be encouraging all our children
to be.
Indeed, earnings make not an impressive man, but it is a measure that gives
at least a little backing to the idea that "maths is a good thing to do lots
of, especially if you're good at it". I don't think Adam was saying that we
should encourage our kids to do more maths purely because the earnings are
probably going to be higher.
Post by Robert de Vincy
Post by Adam
Maths is the pure form of everything.
I disagree. Strongly. For several reasons.
1.
"everything"? Show me how mathematics can describe the effect that a
summer evening's sunset has on my state of mind and how it can conjure up
related memories of good times. Show me how mathematics can describe the
way I feel about pizza. Show me how mathematics can describe the joy of
being in love. Show me how mathematics can describe a lion accurately to
someone who has never seen one and make him appreciate the power and the
ferocity, and the beauty.
No, the only reason none of us can do that is because the reasons are not
yet known. Not because maths can't explain it. Put them down to witchcraft
for the time being, until we can understand them.
Post by Robert de Vincy
2.
To say that something is "the pure form of everything" is getting a little
too close to Essentialism, and that's a concept I don't... can't... subscribe
to.
3.
Mathematics is not the substratum upon which reality is built. It exists
in parallel, allowing us (or someone else who's not me, more likely!) to
model some phenomenon in formal terms. It is NOT the underlying building
blocks that "everything" is built up from.
Says who? I happen to think that it could well be the defining, er, thing
that, er, things must follow. Got no proof for that at all, but I can't see
why not.
Post by Robert de Vincy
Yeah, you can reduce a lot of
things to a mathematical model or formula or whatever, but when you look at
that result you're not looking at the actual thing broken down, but rather
a simulation of the thing (broken down).
No, that's science. That's different. You're describing modelling the real
world with maths. I believe that maths governs what can and will happen. I
emphasise that it is a belief. I also believe that you can't prove that
maths governs what can and will happen.
Post by Robert de Vincy
See, this is part of the "exalting maths too far" thing that I mentioned
in an earlier message. To me, it's conflating the actual event with the
newspaper report you read the next day. It might be a very, very (VERY)
accurate report (with colour pictures and everything!) with every minute
particle described to within an infinite degree of detail, but it's still
not the event itself.
No, but it is. If every single detail is represented mathematically, then it
IS the event. Just abstracted into letters and symbols of ink on paper. But
they are one and the same.
Post by Robert de Vincy
You might be able to use that report to inform
someone else and pass on the details of what happened, answering each and
every question with supreme accuracy, but you're not describing the actual
event, just a very precise analogue of it.
No, if you can pass on every detail, you are describing the actual event.
Post by Robert de Vincy
Before I'm attacked by the Maths Mob, I'm not relegating maths to the
equivalent of some 2nd-rate photocopying machine. What I am saying is
that if you photocopy the Magna Carta on *THE* highest quality photocopier
so that even an expert couldn't tell one from the other just by looking
at them, which would you say is the REAL Magna Carta?
I don't follow that analogy. Maths isn't about describing things, IMO. Maths
IS what things are. Things happen according to the laws of maths.
Post by Robert de Vincy
Have you somehow
"captured" the essence of the Magna Carta and imbued some blank sheet of
paper with it so that you now have two actual Magna Cartas? (Or "Magnae
Cartae"?) No. You have one genuine article, and an abso-fucking-lutely
accurate copy that is still not the actual thing.
You're describing mathematical modelling, not maths itself.
Robert de Vincy
2004-07-30 22:39:05 UTC
Permalink
Ray Pang did write:

[...]
Post by Ray Pang
You're describing mathematical modelling, not maths itself.
Your other points sort of lead me to suspect you view the world (reality,
that is) in a different way to me, and there's no way either one of us
can prove conclusively enough to the other that his version is right.

But...

Show me some "maths". If I have the wrong idea about maths then I want
to be put right about it.

If you write a formula (or whatever) and say "Look, this is such-and-such"
I can't see how that formula is anything but a formal (etymology alert!)
description of some phenomenon. It isn't the phenomenon itself. It's not
like you can reach into it, pull out its essential mathematical-ness, and
somehow denature nature. Or can you?
Post by Ray Pang
Maths IS what things are.
How so? What do you mean by "things"? Physical things? Then are you
saying that all matter is just the interaction of numbers? Actual hold-
em-in-your-hand numbers? That's reification gone waaaaaaay too far for
me.

If not physical "things" then... what do you mean here?
Post by Ray Pang
Things happen according to the laws of maths.
"Law" is a very loaded term. I do hope you're not using it here to mean
something that is set in mythical stone tablets and was decreed by...
God, nature, chance, whatever... and from this all existence then sprang.
Sort of a "mind before matter" thing (with variable definition of "mind",
depending on your beliefs).

These laws are purely descriptive, are they not? They just state what
happens and what doesn't happen, in various contexts. I can't believe that
they have any power of their own underneath reality. That is, if the laws
were different then reality would be different. Isn't it more that if
reality were different then the laws describing it would have to be
different?

So, explain some real "maths" and show me how it is not just a way to
explain the patterning of some phenomenon.
--
BdeV
Ray Pang
2004-07-30 22:47:02 UTC
Permalink
Post by Robert de Vincy
[...]
Post by Ray Pang
You're describing mathematical modelling, not maths itself.
Your other points sort of lead me to suspect you view the world (reality,
that is) in a different way to me, and there's no way either one of us
can prove conclusively enough to the other that his version is right.
Absolutely. It's an opinion, just like I could say that "no, it's not maths,
it's God."
Post by Robert de Vincy
But...
Show me some "maths". If I have the wrong idea about maths then I want
to be put right about it.
If you write a formula (or whatever) and say "Look, this is such-and-such"
I can't see how that formula is anything but a formal (etymology alert!)
description of some phenomenon. It isn't the phenomenon itself. It's not
like you can reach into it, pull out its essential mathematical-ness, and
somehow denature nature. Or can you?
Well, in the abstract world, 2+3 = 5. Hence, in the real world which follows
maths, if I have two apples and you give me three apples, I have five. This
is different to saying, in the real world, if I have two apples and you give
me three, I have five, and hence the maths that 2+3=5 is a model of the
phenomena.
Post by Robert de Vincy
Post by Ray Pang
Maths IS what things are.
How so? What do you mean by "things"? Physical things? Then are you
saying that all matter is just the interaction of numbers? Actual hold-
em-in-your-hand numbers? That's reification gone waaaaaaay too far for
me.
Well, it's just my belief. All things are physical, all physical things are
governed by maths.
Post by Robert de Vincy
If not physical "things" then... what do you mean here?
Post by Ray Pang
Things happen according to the laws of maths.
"Law" is a very loaded term. I do hope you're not using it here to mean
something that is set in mythical stone tablets and was decreed by...
God, nature, chance, whatever... and from this all existence then sprang.
Sort of a "mind before matter" thing (with variable definition of "mind",
depending on your beliefs).
These laws are purely descriptive, are they not? They just state what
happens and what doesn't happen, in various contexts. I can't believe that
they have any power of their own underneath reality. That is, if the laws
were different then reality would be different. Isn't it more that if
reality were different then the laws describing it would have to be
different?
Not in my opinion. Reality is a subset of maths. Maths contains all that is
real and all that is possible.
Post by Robert de Vincy
So, explain some real "maths" and show me how it is not just a way to
explain the patterning of some phenomenon.
See above, with the 2 apples thing. Maths explains the phenomenon because
the phenomenon happens because of maths.
Robert de Vincy
2004-07-30 23:39:42 UTC
Permalink
Ray Pang did write:

[snip here and there]
Post by Ray Pang
Well, in the abstract world, 2+3 = 5.
That's where I come to a screeching halt. My mind is screaming "Why
are we starting from the abstract?!?"
Post by Ray Pang
Hence, in the real world which follows maths, if I have two apples and
you give me three apples, I have five. This is different to saying, in
the real world, if I have two apples and you give me three, I have five,
and hence the maths that 2+3=5 is a model of the phenomena.
It is different in what-obeys-what, but is it any different in
observable proof? If we can hold up the two different versions, see that
both accurately fit into the pattern we are observing, ask "But which one
has most proof?", and find that neither wins on proof alone, then why
choose one over the other, except out of personal preference and/or
beliefs? I'm not asking you to justify your beliefs to me, but I am
curious about why you would pick one viewpoint over another, given that
there isn't any tangible proof weighing in to tip the balance either
way.

Basically, what I'm saying is: why assume there is an abstract plane of
something like Platonic ideals in which maths lives, and then reality
gets modelled on it?
That, for me, has too many factors. It presumes that reality exists
(which is fair enough, I guess!) but it also presumes that there is
another ... something. Another plane, place, realm, abstract storage
area where all these ideas live independently of reality but they make
the reality dance to their tune. Isn't it simpler to see that only reality
exists, and then humans have created in (to take a term from my recent
readings) their "ideosphere" a system that accurately (or strives to be
as accurate as possible) describes all the stuff that's going on in
reality?
Post by Ray Pang
Post by Robert de Vincy
So, explain some real "maths" and show me how it is not just a way to
explain the patterning of some phenomenon.
See above, with the 2 apples thing. Maths explains the phenomenon
because the phenomenon happens because of maths.
To me, that's describing what happens when you place 2 apples and 3
apples together and treat them as a contiguous group. I don't see how
there's a rule of "2+3=5" floating about in abstract space that forces
two apples and three apples to make five apples. Well, I can imagine
how it might be, but I seriously doubt that it does. As I said above,
it involves setting up more things than are strictly necessary to still
arrive at the same result.
--
BdeV
Ray Pang
2004-07-31 09:12:28 UTC
Permalink
Post by Robert de Vincy
[snip here and there]
Post by Ray Pang
Well, in the abstract world, 2+3 = 5.
That's where I come to a screeching halt. My mind is screaming "Why
are we starting from the abstract?!?"
Post by Ray Pang
Hence, in the real world which follows maths, if I have two apples and
you give me three apples, I have five. This is different to saying, in
the real world, if I have two apples and you give me three, I have five,
and hence the maths that 2+3=5 is a model of the phenomena.
It is different in what-obeys-what, but is it any different in
observable proof? If we can hold up the two different versions, see that
both accurately fit into the pattern we are observing, ask "But which one
has most proof?", and find that neither wins on proof alone, then why
choose one over the other, except out of personal preference and/or
beliefs? I'm not asking you to justify your beliefs to me, but I am
curious about why you would pick one viewpoint over another, given that
there isn't any tangible proof weighing in to tip the balance either
way.
Basically, what I'm saying is: why assume there is an abstract plane of
something like Platonic ideals in which maths lives, and then reality
gets modelled on it?
That, for me, has too many factors. It presumes that reality exists
(which is fair enough, I guess!) but it also presumes that there is
another ... something. Another plane, place, realm, abstract storage
area where all these ideas live independently of reality but they make
the reality dance to their tune. Isn't it simpler to see that only reality
exists, and then humans have created in (to take a term from my recent
readings) their "ideosphere" a system that accurately (or strives to be
as accurate as possible) describes all the stuff that's going on in
reality?
No, that sounds like a cop out to me. I refuse to believe that maths is
invented as a means of explaining things. It just seems perfectly feasible
that there is this system which is just "what is", and hence everything
follows. I can't explain why I feel that.
Post by Robert de Vincy
Post by Ray Pang
Post by Robert de Vincy
So, explain some real "maths" and show me how it is not just a way to
explain the patterning of some phenomenon.
See above, with the 2 apples thing. Maths explains the phenomenon
because the phenomenon happens because of maths.
To me, that's describing what happens when you place 2 apples and 3
apples together and treat them as a contiguous group. I don't see how
there's a rule of "2+3=5" floating about in abstract space that forces
two apples and three apples to make five apples. Well, I can imagine
how it might be, but I seriously doubt that it does. As I said above,
it involves setting up more things than are strictly necessary to still
arrive at the same result.
Well it's because you differ in opinion. I think that Maths is the system,
and hence apples, being part of the system, follow the rules of the system.

I think it's too philosophical for me to convince you of my beliefs, as you
have too strong a belief otherwise.
Matthew Huntbach
2004-08-02 10:03:20 UTC
Permalink
Post by Robert de Vincy
Post by Ray Pang
Well, in the abstract world, 2+3 = 5.
That's where I come to a screeching halt. My mind is screaming "Why
are we starting from the abstract?!?"
Post by Ray Pang
Hence, in the real world which follows maths, if I have two apples and
you give me three apples, I have five. This is different to saying, in
the real world, if I have two apples and you give me three, I have five,
and hence the maths that 2+3=5 is a model of the phenomena.
...
Post by Robert de Vincy
Basically, what I'm saying is: why assume there is an abstract plane of
something like Platonic ideals in which maths lives, and then reality
gets modelled on it?
So you want to have to learn

"Two apples plus three apples makes five apples"
"Two oranges plus three oranges makes five oranges"
"Two trombones plus three trombones makes five trombones"
"Two devices for taking stones out of horseshoes plus three devices for
taking stones out of horseshoes makes five devices for taking stones out of
horseshoes"

and so on for every possible type of thing?

Making the step

"Two X plus three X makes five X"

for any X is indeed abstraction, but don't you find it useful?

Matthew Huntbach
Robert de Vincy
2004-08-02 14:31:22 UTC
Permalink
Post by Matthew Huntbach
Post by Robert de Vincy
Post by Ray Pang
Well, in the abstract world, 2+3 = 5.
That's where I come to a screeching halt. My mind is screaming "Why
are we starting from the abstract?!?"
Post by Ray Pang
Hence, in the real world which follows maths, if I have two apples and
you give me three apples, I have five. This is different to saying, in
the real world, if I have two apples and you give me three, I have five,
and hence the maths that 2+3=5 is a model of the phenomena.
...
Post by Robert de Vincy
Basically, what I'm saying is: why assume there is an abstract plane of
something like Platonic ideals in which maths lives, and then reality
gets modelled on it?
So you want to have to learn
"Two apples plus three apples makes five apples"
"Two oranges plus three oranges makes five oranges"
"Two trombones plus three trombones makes five trombones"
"Two devices for taking stones out of horseshoes plus three devices for
taking stones out of horseshoes makes five devices for taking stones out of
horseshoes"
and so on for every possible type of thing?
Making the step
"Two X plus three X makes five X"
for any X is indeed abstraction, but don't you find it useful?
Yes, but that's not what either Ray (as I understand his messages) or I
were saying.

He was (I presume) suggesting that first there's the rule (2x+3x=5x) existing
in its abstract 'place', then reality is based on what the rule says.

I'm suggesting that there's reality, and then we define the rule (2x+3x=5x)
to describe what we are seeing happen in reality.
--
BdeV
Dr A. N. Walker
2004-08-02 13:42:38 UTC
Permalink
Post by Robert de Vincy
If you write a formula (or whatever) and say "Look, this is such-and-such"
I can't see how that formula is anything but a formal (etymology alert!)
description of some phenomenon. It isn't the phenomenon itself.
Right; this is maths, not physics. But OTOH that formula
can give deep insight into the nature of what is "really" happening.
Post by Robert de Vincy
It's not
like you can reach into it, pull out its essential mathematical-ness, and
somehow denature nature. Or can you?
If I understood the question [there are three "it"s, of
uncertain antecedentry], I might know the answer. But see below.

[...]
Post by Robert de Vincy
So, explain some real "maths" and show me how it is not just a way to
explain the patterning of some phenomenon.
Firstly, you asked [but I snipped] about numbers. Most people
do not really understand numbers. We sort-of know about small integers.
But the relations between ordinal and cardinal numbers, infinities of
various kinds, infinitesimals ditto, irrational, transcendental, real
vs complex, connexions between any of these and other phenomena such
as geometry, games, etc., and between any of these and the real world
are barely touched on at school, and not fully explored at univ either.

But anyway, rather than trying to abstract "2+3 = 5", I'd like
to suggest the following as a more fruitful example. The Pythagorean
theories of music were much more successful than their religion, which
disproved itself when sqrt(2) was found to be irrational. Those theories
explained harmony in relation to things like lengths of plucked strings.
Later, Newtonian mechanics plus Fourier analysis plus some relatively
OK theory of partial differential equations explained virtually the
whole phenomenon of music -- its physics, not its aesthetics! -- in
terms of the so-called "wave equation": acceleration proportional to
curvature, with the coefficient of proportionality being the square
of the "speed" of the "wave". [Part of the aesthetics too can be
explained in terms of the maths -- esp Pythagorean harmony and more
obscure stuff such as piano tunings -- but not, AFAIK, what makes
Mozart a greater composer than Salieri.]

Look at a vibrating string [violin], a vibrating column of air
[organ, flute], a vibrating surface [drum] or rod [triangle], almost
any "wave" motion, and the same equation is found, by the physics, to
apply. Conversely, if you find any phenomenon in which sine waves
occur, it's a fair bet that the same "wave equation" underlies what
is going on. So, we also get pressure waves, in which "curvature" is
replaced by pressure variations, and thus sound waves in air or water,
able to transmit vibrations from musical instruments to our ears.
And water waves, surface ripples satisfying the same equation and so
exhibiting the same physical characteristics. Then you get the Maxwell
equations of electromagnetism. Lo and behold, in simple cases, these
turn out to be wave equations whose "speed" is, experimentally, found
to be the speed of light -- hence the leap that (a) electricity and
magnetism can form waves [a new phenomenon to the Victorians, leading
to radio etc in the 20thC] and (b) light is just an example of electro-
magnetic waves [and the discovery of the e-m spectrum of which visible
light is just a small part]. Later physics seems to show that there may
be gravity waves, particle waves, etc., etc.

The point is that the wave equation can be abstracted away to
"second derivative in *this* direction proportional to second derivative
in *that*" [one "direction" being time in many, but not all, cases].
If you *ever* see wave phenomena, then you can suspect that there may
well be an underlying wave equation, so you get physical insight into
what is happening. And conversely, if you ever see, coming out of the
physics, a wave equation, then you can expect to see waves happening,
with a speed given by the coefficient in the equation. This ties
together a huge range of phenomena -- meaning that we can explain
things happening in music or in quantum mechanics by things that we
are more familiar with in water waves or light waves [or vice versa].
Everything that happens in music/water/light/etc happens *also* in
all the others.
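[A quick numerical illustration of the "acceleration proportional to curvature" form mentioned above — my own sketch, not part of the original post. It checks, by central finite differences, that a travelling sine wave u(x,t) = sin(k(x - ct)) satisfies the 1-D wave equation u_tt = c^2 u_xx; the values of k, c, and the sample point are arbitrary choices.]

```python
import math

k, c = 3.0, 2.0          # wavenumber and wave speed (arbitrary)
h = 1e-4                 # step for central finite differences

def u(x, t):
    # a travelling sine wave, the prototypical wave-equation solution
    return math.sin(k * (x - c * t))

def second_diff(f, s):
    # central second difference of a one-argument function f at point s
    return (f(s + h) - 2.0 * f(s) + f(s - h)) / (h * h)

x0, t0 = 0.7, 1.3        # any sample point
u_tt = second_diff(lambda t: u(x0, t), t0)   # "acceleration"
u_xx = second_diff(lambda x: u(x, t0), x0)   # "curvature"

# u_tt agrees with c^2 * u_xx up to finite-difference error
print(u_tt, c**2 * u_xx)
```

[The same check works for any smooth f(x - ct), which is the point: the equation, not the particular wave, is the shared structure.]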

Is that not useful? I don't know whether it's the sort of
thing you are looking for.
--
Andy Walker, School of MathSci., Univ. of Nott'm, UK.
***@maths.nott.ac.uk
Robert de Vincy
2004-08-02 14:54:03 UTC
Permalink
Post by Dr A. N. Walker
Post by Robert de Vincy
So, explain some real "maths" and show me how it is not just a way to
explain the patterning of some phenomenon.
Firstly, you asked [but I snipped] about numbers. Most people
do not really understand numbers. We sort-of know about small integers.
But the relations between ordinal and cardinal numbers, infinities of
various kinds, infinitesimals ditto, irrational, transcendental, real
vs complex, connexions between any of these and other phenomena such
as geometry, games, etc., and between any of these and the real world
are barely touched on at school, and not fully explored at univ either.
But anyway, rather than trying to abstract "2+3 = 5", I'd like
to suggest the following as a more fruitful example. The Pythagorean
theories of music were much more successful than their religion, which
disproved itself when sqrt(2) was found to be irrational. Those theories
explained harmony in relation to things like lengths of plucked strings.
Later, Newtonian mechanics plus Fourier analysis plus some relatively
OK theory of partial differential equations explained virtually the
whole phenomenon of music -- its physics, not its aesthetics! -- in
terms of the so-called "wave equation": acceleration proportional to
curvature, with the coefficient of proportionality being the square
of the "speed" of the "wave". [Part of the aesthetics too can be
explained in terms of the maths -- esp Pythagorean harmony and more
obscure stuff such as piano tunings -- but not, AFAIK, what makes
Mozart a greater composer than Salieri.]
Look at a vibrating string [violin], a vibrating column of air
[organ, flute], a vibrating surface [drum] or rod [triangle], almost
any "wave" motion, and the same equation is found, by the physics, to
apply. Conversely, if you find any phenomenon in which sine waves
occur, it's a fair bet that the same "wave equation" underlies what
is going on. So, we also get pressure waves, in which "curvature" is
replaced by pressure variations, and thus sound waves in air or water,
able to transmit vibrations from musical instruments to our ears.
And water waves, surface ripples satisfying the same equation and so
exhibiting the same physical characteristics. Then you get the Maxwell
equations of electromagnetism. Lo and behold, in simple cases, these
turn out to be wave equations whose "speed" is, experimentally, found
to be the speed of light -- hence the leap that (a) electricity and
magnetism can form waves [a new phenomenon to the Victorians, leading
to radio etc in the 20thC] and (b) light is just an example of electro-
magnetic waves [and the discovery of the e-m spectrum of which visible
light is just a small part]. Later physics seems to show that there may
be gravity waves, particle waves, etc., etc.
The point is that the wave equation can be abstracted away to
"second derivative in *this* direction proportional to second derivative
in *that*" [one "direction" being time in many, but not all, cases].
If you *ever* see wave phenomena, then you can suspect that there may
well be an underlying wave equation, so you get physical insight into
what is happening. And conversely, if you ever see, coming out of the
physics, a wave equation, then you can expect to see waves happening,
with a speed given by the coefficient in the equation. This ties
together a huge range of phenomena -- meaning that we can explain
things happening in music or in quantum mechanics by things that we
are more familiar with in water waves or light waves [or vice versa].
Everything that happens in music/water/light/etc happens *also* in
all the others.
Is that not useful?
Useful, certainly.
Post by Dr A. N. Walker
I don't know whether it's the sort of thing you are looking for.
It doesn't explain to me nor convince me how it is the mathematics (as Ray
Pang suggests) which creates the reality of those phenomena and not (as I
suggest) the phenomena which exist initially leading to the mathematics
describing them and their (shared) behaviours.

I'm not disputing that, for example, the behaviour of waves can be described
in mathematical terms and these can be shown to be true/useful/predictive for
all wave-like phenomena. That's not the issue.

The issue is: What is the underlying "layer" from which things arise?

Saying (as Ray Pang did earlier) that mathematics is like some sort of
foundation upon which reality is actually created (not just described or shown
to be acting like, but actually physically built from) is something I can't
believe, as it involves a sort of "mind before matter" Prime Mover leap-of-
faith which I can't accommodate, for several reasons. The main reason is that
it builds up extra levels of complexity (in having to assume there is an Ideal
Plane which existed before reality) to achieve the same ends as thinking that
the behaviour of reality is just, well, there, and mathematics can then be
applied to help describe the behaviour that we're seeing.
--
BdeV
Dr A. N. Walker
2004-08-02 16:01:09 UTC
Permalink
Post by Robert de Vincy
The issue is: What is the underlying "layer" from which things arise?
Again, I don't know whether it helps, but there is at least
a level/sense at/in which maths can literally create itself out of
nothing. Even before there is anything, there is an existing set of
things, viz the empty set [containing nothing]. Call that set "0"
[just a name, nothing new happening yet]. Then we can construct a
set containing 0; call that set "1" [ditto]. Then a new set which
contains 1, called "2". Etc. With some pushing and squeezing, we
can construct the whole of mathematics this way. [There is actually
a much better way to do this using games, but I digress.] IOW, a
"prime mover" could build maths before needing to construct any sort
of physical universe. *Then* the laws of physics, if built along
the lines of Dirac's version of quantum mechanics, seem to allow the
capability of building physical objects out of a vacuum. After that
we're up and running.
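[A toy rendering of the construction just described — my illustration, not Dr Walker's wording. Each "number" is the set containing the previous one, starting from the empty set: 0 = {}, 1 = {0}, 2 = {1}, and so on. Python's frozenset stands in for a set that can itself be a member of a set.]

```python
def number(n):
    """Build the n-th set in the chain: number(0) = {}, number(k+1) = {number(k)}."""
    s = frozenset()          # the empty set: "0", assumed to exist before anything else
    for _ in range(n):
        s = frozenset([s])   # wrap the previous set in a new one-element set
    return s

zero, one, two = number(0), number(1), number(2)

# Each set contains exactly the previous one; nothing beyond the
# empty set was assumed at the start.
assert zero == frozenset()
assert one == frozenset([zero])
assert two == frozenset([one])
```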

So pure maths => applied maths => physics => reality.

It's quite possible that the universe is the way it is because
that's the only way it can bootstrap itself into existence *and* then
allow for intelligence that could understand the above argument.

At some stage here, I usually recommend that people read
Eddington, who is sadly unfashionable these days. "Fundamental
Theory" tells you all you need to know; but the Clifford algebras
in the middle are somewhat heavy going for the uninitiated.
--
Andy Walker, School of MathSci., Univ. of Nott'm, UK.
***@maths.nott.ac.uk
Robert de Vincy
2004-08-02 17:12:33 UTC
Permalink
Post by Dr A. N. Walker
Post by Robert de Vincy
The issue is: What is the underlying "layer" from which things arise?
Again, I don't know whether it helps, but there is at least
a level/sense at/in which maths can literally create itself out of
nothing. Even before there is anything, there is an existing set of
things, viz the empty set [containing nothing]. Call that set "0"
[just a name, nothing new happening yet]. Then we can construct a
set containing 0; call that set "1" [ditto]. Then a new set which
contains 1, called "2". Etc. With some pushing and squeezing, we
can construct the whole of mathematics this way.
Okay, I understand that bit (after about 5 minutes of "What the...?"
followed by a couple of minutes of "Ah-hah! That's what he means!").
Post by Dr A. N. Walker
[There is actually a much better way to do this using games, but I
digress.] IOW, a "prime mover" could build maths before needing to
construct any sort of physical universe. *Then* the laws of physics,
if built along the lines of Dirac's version of quantum mechanics,
seem to allow the capability of building physical objects out of a
vacuum. After that we're up and running.
So pure maths => applied maths => physics => reality.
I detect the doubt by your use of "seem to allow the capability..."

And that's the bit I can't just swallow wholesale without further proof.

Why would it be that your sequence of events is the way it happened and
not the other way? I.e. physical reality starts -- we don't know how
it starts in this theory, since all I'm trying to do is establish what
is based on what: reality out of mathematical principles, or maths based
on reality's behaviour? -- and then we can look at the way reality is
behaving and derive some generalizations which can be formalized into
mathematical formulas, proofs, truths, the whole ball of wax that makes up
mathematics.

Is there some small piece of evidence that tips the scales towards the
"Maths First" theory, something that I've missed or has not been mentioned
yet?
Post by Dr A. N. Walker
It's quite possible that the universe is the way it is because
that's the only way it can bootstrap itself into existence *and* then
allow for intelligence that could understand the above argument.
Or, equally, it's the way it is because this is the only pattern of
behaviour that actually results in us being here and now. And *THEN* we
can look at this particular bunch of phenomena (the ones that produce us
and this existence) and extract our mathematical bits-and-pieces to give
the underlying patterns some form.
Post by Dr A. N. Walker
At some stage here, I usually recommend that people read
Eddington, who is sadly unfashionable these days. "Fundamental
Theory" tells you all you need to know; but the Clifford algebras
in the middle are somewhat heavy going for the uninitiated.
Would it address the problem I'm trying so desperately to put across here?

To stop this drifting any further, my reasoning is in response to two
related replies. Namely Adam who said that "maths is the pure form of
everything" and Ray who believes that "the phenomenon happens because of
maths".

I'm taking the opposite view, namely "the phenomenon happens [we don't
know why it does but it does], and we can use maths to describe the pattern
of behaviour that occurs".
--
BdeV
Ray Pang
2004-08-02 20:50:19 UTC
Permalink
Post by Robert de Vincy
Post by Dr A. N. Walker
Post by Robert de Vincy
The issue is: What is the underlying "layer" from which things arise?
Again, I don't know whether it helps, but there is at least
a level/sense at/in which maths can literally create itself out of
nothing. Even before there is anything, there is an existing set of
things, viz the empty set [containing nothing]. Call that set "0"
[just a name, nothing new happening yet]. Then we can construct a
set containing 0; call that set "1" [ditto]. Then a new set which
contains 1, called "2". Etc. With some pushing and squeezing, we
can construct the whole of mathematics this way.
Okay, I understand that bit (after about 5 minutes of "What the...?"
followed by a couple of minutes of "Ah-hah! That's what he means!").
Post by Dr A. N. Walker
[There is actually a much better way to do this using games, but I
digress.] IOW, a "prime mover" could build maths before needing to
construct any sort of physical universe. *Then* the laws of physics,
if built along the lines of Dirac's version of quantum mechanics,
seem to allow the capability of building physical objects out of a
vacuum. After that we're up and running.
So pure maths => applied maths => physics => reality.
I detect the doubt by your use of "seem to allow the capability..."
And that's the bit I can't just swallow wholesale without further proof.
Why would it be that your sequence of events is the way it happened and
not the other way? I.e. physical reality starts -- we don't know how
it starts in this theory, since all I'm trying to do is establish what
is based on what: reality out of mathematical principles, or maths based
on reality's behaviour? -- and then we can look at the way reality is
behaving and derive some generalizations which can be formalized into
mathematical formulas, proofs, truths, the whole ball of wax that makes up
mathematics.
No no, we can't possibly observe reality's behaviour and come up with
anything as concrete as a mathematical proof or mathematical truths. For
example, what real world phenomena can we observe to categorically say that
sqrt(2) is irrational, or that 2+3=5? Now, assuming we have mathematics, we
can categorically say that two oranges plus three oranges always always
gives five oranges. Where the maths comes from is irrelevant - OK us humans
may have realised that 2+3=5 without actually having any concrete maths, but
the system we invented called maths is completely sound (I think, maybe I'm
out of my depth here), and thus that system tells us that we MUST have five
oranges.
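[The irrationality claim above is a good example of a truth no amount of apple-counting could establish. For the record, the standard textbook proof — not from this thread — runs:]

```latex
Suppose $\sqrt{2} = p/q$ with $p, q$ integers, $q \neq 0$, and the
fraction in lowest terms. Then $p^2 = 2q^2$, so $p^2$ is even, hence
$p$ is even; write $p = 2r$. Then $4r^2 = 2q^2$, i.e.\ $q^2 = 2r^2$,
so $q$ is even too. But then $p$ and $q$ share the factor $2$,
contradicting lowest terms. Hence $\sqrt{2}$ is irrational.
```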
Post by Robert de Vincy
Is there some small piece of evidence that tips the scales towards the
"Maths First" theory, something that I've missed or has not been mentioned
yet?
Post by Dr A. N. Walker
It's quite possible that the universe is the way it is because
that's the only way it can bootstrap itself into existence *and* then
allow for intelligence that could understand the above argument.
Or, equally, it's the way it is because this is the only pattern of
behaviour that actually results in us being here and now. And *THEN* we
can look at this particular bunch of phenomena (the ones that produce us
and this existence) and extract our mathematical bits-and-pieces to give
the underlying patterns some form.
No, that is precisely what physics and scientific endeavour is. There's a
very big difference between that discipline and mathematics.
Post by Robert de Vincy
Post by Dr A. N. Walker
At some stage here, I usually recommend that people read
Eddington, who is sadly unfashionable these days. "Fundamental
Theory" tells you all you need to know; but the Clifford algebras
in the middle are somewhat heavy going for the uninitiated.
Would it address the problem I'm trying so desperately to put across here?
To stop this drifting any further, my reasoning is in response to two
related replies. Namely Adam who said that "maths is the pure form of
everything" and Ray who believes that "the phenomenon happens because of
maths".
I'm taking the opposite view, namely "the phenomenon happens [we don't
know why it does but it does], and we can use maths to describe the pattern
of behaviour that occurs".
That's not what I understood your point to be. We can indeed use maths to
describe the pattern of behaviour that occurs - I don't doubt that. The
reason why we can do this is because reality is following maths in the first
place, so quite naturally we can use maths to describe it. We don't know why
things happen either. I don't know why 2+3=5 - it just does. I understand
it's something to do with set theory, coming from a starting point which
isn't a question, it's just a statement (seemingly confirmed by Dr Walker's
posts earlier).
Robert de Vincy
2004-08-02 22:12:09 UTC
Permalink
Ray Pang did write:

[snip]
The reason why we can do this is because reality is following maths in
the first place, so quite naturally we can use maths to describe it.
You keep saying that, but where is the final bit of evidence that makes
this the 'correct' view and my view 'incorrect'?

All you've done so far is just say "No, you're wrong. I can't explain
why, you just are." Well, forgive me for not being totally convinced
by that reasoning.

I am hoping that you can see this with an open mind. Or, at least, open
enough to see what my position is and how it relates to what you are
saying. Let's assume you do.
I also hope that you can see that both positions are likely. If you look
at both positions (non-judgementally, without the bias of investing X
amount of years of your life on a maths degree course), I am desperately
hoping you can see that from all the evidence I have been presented with
in this thread, there is nothing solid to choose either position. Nothing
has been mentioned in a reply that could be the final clincher and make
me go "Ah, yes! That's the bit of evidence that tips it in the favour of
'Maths first, stuff afterwards'!"

So, still assuming that you can see there is equal validity to both sides
(the big clue here is that there is no tangible difference -- if we take
your view to be right then the final result is exactly the same as that
if we take my view to be right). So, what's to choose? Why do you go
with the "Maths first" opinion? Is it purely because you've spent 3(?)
years studying maths to university level? Or are you (and Andy Walker)
hiding some vital clue from me?
OK us humans may have realised that 2+3=5 without actually having any
concrete maths, but the system we invented called maths is completely sound
[...], and thus that system tells us that we MUST have five oranges.
How does this convince me (and, indeed, you) that there exists an abstract
bunch of mathematical rules (etc) that were created or formed or came
about before there was a reality upon which they could operate? This,
to me, implies that such rules (etc) must exist "somewhere". From such a
view, they can't merely exist as an abstract, or else how do they interact
with reality and impose their order on it?

As an alternative, the view I'm suggesting is that reality just happens.
Stuff happens. Phenomena occur. And, because this is all consistent,
we can generalize and create our own abstract realm full of mathematical
rules. We can see that every time we add 2 things to 3 things, we will
always get 5 of those things. It's the way the world works, and because
it is this way (as opposed to some other way), then we create the idea
"2 + 3 = 5" -- we can test it out: add 2 apples and 3 apples... voilà!
5 apples! It's a good description -- a highly accurate description -- of
what happens, so we keep it and apply it whenever the conditions (of
having 2 items added to 3 items) pertain.

Just like we (as humans) can create fictional worlds, schools of thought
that have their set of ideas, philosophies, religions, etc, we have an
Idea Space that we fill with mathematical things. The main difference
between those creations and the mathematical Idea Space we have created
is that the only ideas that can enter (or, at least, continue to exist
there after being thoroughly scrutinized and examined) into the mathematical
Idea Space are those which accurately describe the consistent behaviours
of what we find in reality. This Mathematical Idea Space must exist even
in your "Maths First" view, as it is the sum knowledge of what humans
know about mathematics, and we certainly don't know every fact and detail
that can be known about mathematics.

I realize that I'm guilty of the same charge that I raise in my first
sentence of this reply.
So, here is why I edge towards my version of "Reality first, maths after":
simplicity.

It seems to me that to get an identical end result (what we can see all
around us and find out through various methods of investigation), we
can either go your way which necessitates the existence of:
1-an abstract realm in which mathematical rules live, influencing reality;
2-reality;
3-a realm of human-created abstracts into which we pour all of our acquired
mathematical rules (it's obvious we don't know them all, or else research
in maths would be a fruitless task).
My way requires the existence of:
1-reality;
2-a realm of human-created abstracts into which we pour all of our acquired
mathematical rules.

The question of "Whence your initial abstract realm of mathematics?" (place
#1 in "your" list above) also leads me too close to the concept of a Creator
or some... thing... capable of sustaining the abstracts until they are
sufficient in their number and capabilities to build a reality. AND! The
underlying question is left unanswered, namely:
Where did mathematics come from?
- In your view, its origin is unexplained. It just sort of "is". As a
First Cause sorta thing. Did anything cause the First Cause, etc, etc
towards infinite regression.
- In my view, its origin is explained by saying maths is a very accurate
description of the patterns of behaviour we can discover in reality.
(Which doesn't answer "How does reality originate?", but let's try to
tackle one question at a time, yeah?)


I'm now running out of ways to say the same thing. I hope you can apply
some of the much-touted "logical and analytical thinking" that mathematics
inspires and see exactly what position I'm arguing from. And why.
--
BdeV
Ray Pang
2004-08-02 23:13:28 UTC
Permalink
Post by Robert de Vincy
[snip]
The reason why we can do this is because reality is following maths in
the first place, so quite naturally we can use maths to describe it.
You keep saying that, but where is the final bit of evidence that makes
this the 'correct' view and my view 'incorrect'?
All you've done so far is just say "No, you're wrong. I can't explain
why, you just are." Well, forgive me for not being totally convinced
by that reasoning.
I did say earlier that it was a belief of mine, rather than it being the
cold truth that I must be right.
You can't expect me to put "This is purely a belief of mine because I don't
have any proof of it, but..." before every statement I make.
Post by Robert de Vincy
I am hoping that you can see this with an open mind. Or, at least, open
enough to see what my position is and how it relates to what you are
saying. Let's assume you do.
I also hope that you can see that both positions are likely.
Well they're quite contrasting so how they can both be likely is one for the
probability theorists.
Post by Robert de Vincy
If you look
at both positions (non-judgementally, without the bias of investing X
amount of years of your life on a maths degree course), I am desperately
hoping you can see that from all the evidence I have been presented with
in this thread, there is nothing solid to choose either position. Nothing
has been mentioned in a reply that could be the final clincher and make
me go "Ah, yes! That's the bit of evidence that tips it in the favour of
'Maths first, stuff afterwards'!"
Well if there was, then somebody on this newsgroup would be a complete
genius. All I'm trying to do is convince you of why I believe in what I
believe, and why I don't think what you believe is that case. Sadly,
every time I come up with something to put your idea down, I think of a
counterargument (which is a bit of a fickle point, but that's partly the
point) which I can't address.
Post by Robert de Vincy
So, still assuming that you can see there is equal validity to both sides
(the big clue here is that there is no tangible difference -- if we take
your view to be right then the final result is exactly the same as that
if we take my view to be right). So, what's to choose? Why do you go
with the "Maths first" opinion? Is it purely because you've spent 3(?)
years studying maths to university level? Or are you (and Andy Walker)
hiding some vital clue from me?
From what I understand, you're confusing physics with maths.
Post by Robert de Vincy
OK us humans may have realised that 2+3=5 without actually having any
concrete maths, but the system we invented called maths is completely sound
[...], and thus that system tells us that we MUST have five oranges.
How does this convince me (and, indeed, you) that there exists an abstract
bunch of mathematical rules (etc) that were created or formed or came
about before there was a reality upon which they could operate? This,
to me, implies that such rules (etc) must exist "somewhere". From such a
view, they can't merely exist as an abstract, or else how do they interact
with reality and impose their order on it?
I don't believe that they have to exist anywhere. It's just a system which
is self-consistent (you can't argue with that). Thus reality cannot possibly
disobey maths, otherwise it would mean that maths is not consistent, which
isn't the case.

Now let's try it the other way around. We want to show that reality governs
maths. Let's suppose not. Then that means reality does not govern maths, and
so there is maths out there which reality doesn't describe. I can't see any
contradiction here. While that doesn't prove that reality DOESN'T govern
maths, it gives me a reason to swing that way.
Post by Robert de Vincy
As an alternative, the view I'm suggesting is that reality just happens.
Stuff happens. Phenomena occur. And, because this is all consistent,
we can generalize and create our own abstract realm full of mathematical
rules. We can see that every time we add 2 things to 3 things, we will
always get 5 of those things. It's the way the world works, and because
it is this wat (as opposed to some other way), then we create the idea
"2 + 3 = 5" -- we can test it out: add 2 apples and 3 apples... voilà!
5 apples! It's a good description -- a highly accurate description -- of
what happens, so we keep it and apply it whenever the conditions (of
having 2 items added to 3 items) pertains.
That's science. If we take that view, then because we've seen that two
apples plus three apples gives five apples, *then* we invent the maths that
2+3=5. Testing this with oranges holds true, so we assume it holds true for
all things. There is no logical validity to this, and I hope you can see how
this sort of approach leads to problems. Now maths being a consistent system
means that 2+3=5, and logical deduction therefore implies that two apples
plus three apples equals five apples. That's how I think it works, and that
is what I think makes the difference. I can't put my opinion down any more
clearly than that. It's not because I invested 4 years doing a maths
degree - a maths degree doesn't cover any of this philosophical stuff and
its content is unaffected whether you're right and I'm wrong, vice versa, or
if we're both right or both wrong.
Post by Robert de Vincy
Just like we (as humans) can create fictional worlds, schools of thought
that have their set of ideas, philosophies, religions, etc, we have an
Idea Space that we fill with mathematical things. The main difference
between those creations and the mathematical Idea Space we have created
is that the only ideas that can enter (or, at least, continue to exist
there after being thoroughly scrutinized and examined) into the mathematical
Idea Space are those which accurately describe the consistent behaviours
of what we find in reality.
No they're not. People had concepts of more than three dimensions long
before any notion of multi-dimensionality could be thought of as existing in
reality. That's what I'm getting at. What is the reality equivalent of
C^2755 (where C is the set of complex numbers)? It's a mathematical object
that simply has no (obvious) parallel in reality. Which suggests to me that
maths can't all come from observation of reality, some of it is through pure
abstraction. One day somebody might have thought, "What's four dimensional
space like?" and come up with lots of remarkable patterns of four
dimensional space that then transpired had a parallel with reality. Does
that seem too unreasonable?
Post by Robert de Vincy
This Mathematical Idea Space must exist even
in your "Maths First" view, as it is the sum knowledge of what humans
know about mathematics, and we certainly don't know every fact and detail
that can be known about mathematics.
I don't see what you're getting at there. Of course we don't know all about
maths. But do we need to in order to understand reality? Is there maths
which doesn't bear the slightest resemblance to reality? I'm not the one to
say definitively, but my gut feeling is "definitely".
Post by Robert de Vincy
I realize that I'm guilty of the same charge that I raise in my first
sentence of this reply.
simplicity.
It seems to me that to get an identical end result (what we can see all
around us and find out through various methods of investigation), we need:
1-an abstract realm in which mathematical rules live, influencing reality;
2-reality;
3-a realm of human-created abstracts into which we pour all of our acquired
mathematical rules (it's obvious we don't know them all, or else research
in maths would be a fruitless task).
No, my way does not say that reality needs to exist. My way says that
reality is part of maths. My way suggests that existence of maths implies
existence of reality, since reality is something that we accept adheres to
rule and regulation. And the "realm in which mathematical rules live" bit is
a bit of a strange way of putting it too. I prefer to see it as a realm OF
mathematical rules. Nor does my way require 3). The maths can perfectly well
exist without anybody knowing.
Post by Robert de Vincy
1-reality;
2-a realm of human-created abstracts into which we pour all of our acquired
mathematical rules.
The question of "Whence your initial abstract realm of mathematics?" (place
#1 in "your" list above) also leads me too close to the concept of a Creator
or some... thing... capable of sustaining the abstracts until they are
sufficient in their number and capabilities to build a reality. AND! The
Where did mathematics come from?
- In your view, its origin is unexplained. It just sort of "is". As a
First Cause sorta thing. Did anything cause the First Cause, etc, etc
towards infinite regression.
- In my view, its origin is explained by saying maths is a very accurate
description of the patterns of behaviour we can discover in reality.
(Which doesn't answer "How does reality originate?", but let's try to
tackle one question at a time, yeah?)
Well no, let's not tackle either, because that's not what was being debated,
and both are quite silly questions that nobody has an answer to. It's not a
case of what comes before whichever is first, it's a case of which is first
out of the two.
Post by Robert de Vincy
I'm now running out of ways to say the same thing. I hope you can apply
some of the much-touted "logical and analytical thinking" that mathematics
inspires and see exactly what position I'm arguing from. And why.
Yes, I can, and because you haven't put any concrete backing to what you
believe, I can't accept what you say any more than what I say. I'm only
bothering with all this because I'm making the assumption that you are
open-minded.
Dr A. N. Walker
2004-08-03 14:03:17 UTC
Permalink
[...] We can see that every time we add 2 things to 3 things, we will
always get 5 of those things. It's the way the world works, and because
it is this way (as opposed to some other way), then we create the idea
"2 + 3 = 5" -- we can test it out: add 2 apples and 3 apples... voilà!
5 apples! It's a good description -- a highly accurate description -- of
what happens, so we keep it and apply it whenever the conditions (of
having 2 items added to 3 items) pertain.
In "2+3=5", there is a tendency (a) to concentrate on the "2",
"3", "5" and forget the "+", "=", and (b) to assume that it is always
true. "2+3=5" as a piece of maths is largely definition -- "5" is
*defined* to be "4'", where "'" is the successor function, etc. --
and logical theorem -- eg "a + b' = a' + b". It's all done properly
in some rather advanced maths texts. And then of course we love the
abstractions -- two apples plus three apples makes five apples.
What about 2 o'clock plus 3 o'clock makes 5 o'clock? 29th August +
7 days = 36th August? Two friends plus three friends makes five
friends [how do we know they even all know each other]? Where did we
insist that the apples are all different? In what circumstances do
two half glasses of water make one glass?
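The "largely definition" point can be made concrete with a sketch (my own construction in Python, not anything from the post): in a Peano-style setting the numerals are built from zero and the successor function, addition is defined by recursion, and "2 + 3 = 5" drops out as a theorem rather than an observation. Clock arithmetic, by contrast, quietly swaps in a *different* "+" (addition modulo 12):

```python
# Peano-style naturals, modelled on machine ints for brevity:
# a numeral is the successor function S applied repeatedly to zero.
def S(n):
    return n + 1  # successor

ZERO = 0
TWO = S(S(ZERO))
THREE = S(S(S(ZERO)))
FIVE = S(S(S(S(S(ZERO)))))

# Addition defined by recursion on the second argument:
#   a + 0    = a
#   a + S(b) = S(a + b)
def add(a, b):
    return a if b == 0 else S(add(a, b - 1))

assert add(TWO, THREE) == FIVE  # "2 + 3 = 5" as a consequence of the definitions

# Clock arithmetic: "+" here means addition mod 12, a different operator,
# so 11 o'clock plus 3 hours makes 2 o'clock, not 14 o'clock.
def clock_add(a, b):
    return (a + b - 1) % 12 + 1  # keep results in the range 1..12

assert clock_add(11, 3) == 2
assert clock_add(2, 3) == 5
```

The point of the two `assert`s at the end: the same surface statement "2 plus 3 makes 5" is true under both operators, but only by coincidence of the small numbers involved.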

In each case, we need to know what is meant by the plus and
makes operators. If these turn out to be abstractions of the maths
operators, then there is nothing of interest when two X plus three X
makes five X. But there are lots of bits of maths/physics/etc where
"plus"/"equals" do not have their abstract integer-arithmetic meaning.
This happens especially with "divides"; there is usually *some* way
to add things, if only by imagining them physically co-located, but
"sharing" is a different matter. "Every mathematician" knows that
you can't divide one matrix/vector/graph by another [which does not
stop lots of weak students trying].

IOW, you can divide 24 by 4 and get 6: 24/4 = 6. So you can
imagine dividing 24 inches by 4 inches to get 6 [but *not* 6 inches],
which might be a useful calculation when measuring up for tiles; you
might share 24 apples among 4 people and get 6 apples each; but you
can't divide 24 apples by 4 apples.
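The inches example can be sketched with a toy dimensioned-quantity type (again my own illustration, not from the post): dividing a length by a length cancels the unit and leaves a bare ratio, which is why 24 inches divided by 4 inches is 6 and not "6 inches".

```python
from dataclasses import dataclass

# A toy quantity: a value plus the power of a single unit ("inch").
# inch_exp == 1 means inches, 0 means dimensionless.
@dataclass
class Quantity:
    value: float
    inch_exp: int

    def __truediv__(self, other):
        # Dividing quantities divides the values and *subtracts*
        # the unit exponents, so like units cancel.
        return Quantity(self.value / other.value,
                        self.inch_exp - other.inch_exp)

wall = Quantity(24, 1)   # 24 inches
tile = Quantity(4, 1)    # 4 inches

ratio = wall / tile
assert ratio.value == 6 and ratio.inch_exp == 0  # a pure number: 6 tiles, not "6 inches"
```

There is no analogous exponent to cancel for "apples divided by apples" sharing, which is the asymmetry between adding and dividing that the post describes.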
--
Andy Walker, School of MathSci., Univ. of Nott'm, UK.
***@maths.nott.ac.uk
Dr A. N. Walker
2004-08-03 12:20:53 UTC
Permalink
Post by Robert de Vincy
[...] IOW, a "prime mover" could build maths before needing to
construct any sort of physical universe. *Then* the laws of physics,
if built along the lines of Dirac's version of quantum mechanics,
seem to allow the capability of building physical objects out of a
vacuum. After that we're up and running.
So pure maths => applied maths => physics => reality.
I detect the doubt by your use of "seem to allow the capability..."
Well, I'm not a physicist.
Post by Robert de Vincy
And that's the bit I can't just swallow wholesale without further proof.
We tend to think of a vacuum as being empty. AIUI, it's more
like a "sea level". Just "under" the vacuum it's full of "water", but
the bit we occupy is mostly devoid of it, until some wave comes along.
Quantum mechanics gives you lots of waves, even without wind or tide.
So you could be sitting on the shore, not realising that water ever
does anything [no wind, no tide], then suddenly QM comes along and
drenches you as a "fluctuation", creating a visible universe of water
in our world in the process.
Post by Robert de Vincy
Why would it be that your sequence of events is the way it happened and
not the other way? I.e. physical reality starts -- we don't know how
it starts in this theory, since all I'm trying to do is establish what
is based on what: reality out of mathematical principles, or maths based
on reality's behaviour? [...]
What would it mean for reality to start *unless* it obeyed
some physical laws? Even HHGTTG can't try to run with a universe
that is completely lawless, only with one in which the laws are
stranger than we realise. How would any law be expressible if not
in terms of mathematics?
Post by Robert de Vincy
Is there some small piece of evidence that tips the scales towards the
"Maths First" theory, something that I've missed or has not been mentioned
yet?
Worth noting that *if* the universe is even somewhat accidental,
"universe first, laws/physics/maths later", then it is much more likely
than not that laws would have frequent exceptions. *Our* neck of the
woods may perhaps have to be regular and law-abiding for epistemological
reasons [if the Sun did not shine essentially uniformly for billions of
years then we wouldn't be here, and if any relatively nearby star did
something wild, that too could kill us all], but we really don't care
what physics/chemistry are like in the remoter galaxies. So we could
expect that remote galaxies would have quite different laws, if any
at all; but our observations of them seem to imply that their laws are
identical with ours. But "laws of physics the same everywhere and at
all times" is itself a strong law of physics, which has profound
implications on what physics laws are possible. The theoretical physics
that investigates those implications is just [part of] applied maths,
itself built on pure maths.

The universe is too much of a coincidence to have built
itself and *then* decided how to work. [This remains true even
if the nature of lawlessness is that it decays, so that the
universe becomes more lawful as it evolves.]
Post by Robert de Vincy
At some stage here, I usually recommend that people read
Eddington, who is sadly unfashionable these days. "Fundamental
Theory" tells you all you need to know; but the Clifford algebras
in the middle are somewhat heavy going for the uninitiated.
Would it address the problem I'm trying so desperately to put across here?
Partly, though maths and physics have evolved a long way in
the last 60 years. But Eddington in general was a great populariser
of relativity, epistemology and numerology. For example, can it be
a pure coincidence that the smallest elementary particles are the
size of the uncertainty in position of the entire universe?
--
Andy Walker, School of MathSci., Univ. of Nott'm, UK.
***@maths.nott.ac.uk
Adam
2004-07-30 22:40:00 UTC
Permalink
Post by Robert de Vincy
Post by Adam
Post by Robert de Vincy
[snips to stop this being 1000 lines long]
Post by Adam
I think evidence of logical and methodical thinking is pretty hard
to find, even if it was commonplace.
But there *MUST* be effects that can be observed if you base your
whole argument on "Do more maths and you will do better at other
things", yes?
If there is no measurable or even
visible-yet-not-in-any-way-quantifiable advantage to doing "more
maths" then what is the effect you are hoping to achieve? "Do this
thing here, and you will gain an advantage that can't be seen or
measured!" Umm, I don't think that's really what you're trying to
say, is it?
I said it's hard to find, it's elusive, difficult to measure. Doesn't
mean it doesn't exist.
So, your advice now is to struggle with something that is (according to
other posts that have appeared in this thread) a mysterious and arcane
subject that many (note: many) people just don't "get" (and, statistically
speaking, you WILL be one of those, oh yes <insert smiley giving a knowing
wink, alluding to some other recent discussion -- it's true, a picture can
be worth a thousand words! Or, in this case, 12>) to receive some very
minuscule and/or difficult to perceive benefit? I hope you don't ever
become a salesman...
Post by Adam
The average earnings is a good one,
I'm unimpressed by earnings. According to that reasoning, professional
football players are something we should be encouraging all our children
to be.
Post by Adam
and I have bagfuls of anecdotal evidence that maths people are the
cleverest people I know, not just at maths, but in general, they are
able to grasp new ideas better and quicker, and draw conclusions from
them faster.
You might have bagfuls of evidence that you know some clever people who
also do maths, but is that evidence that maths caused the cleverness?
If you do have that evidence, then PLEASE POST IT NOW! I would be
convinced beyond all measure.
It, of course, isn't, but the whole point of this thread is me trying to
convince you that it might be and you trying to convince me otherwise.

What we need to do, is take two identical people, teach one person maths,
and don't teach the other person maths, and watch them throughout their
life, and see, at the age of 20, whether one person is noticeably cleverer
than the other.

However that isn't quite possible. So what else can we do? We can look for
evidence of mathematical thinking helping with 'real world' things.
Not just adding up or anything to do with numbers at all, but general
problem solving. Like putting a flat-pack bed together, or ironing a shirt.
I believe there is a lot of maths in common sense.


<little snip>
Post by Robert de Vincy
Post by Adam
Ah, but all science is, is applied maths :)
I am SO glad you put that smiley there, or I'd be incensed so much that
I wouldn't be able to think coherently, reply to your message, or even
stand up for at least two days.
Post by Adam
Maths is the pure form of everything.
I disagree. Strongly. For several reasons.
<big snip>

the smiley should've covered that line as well :)
Post by Robert de Vincy
Before I'm attacked by the Maths Mob, I'm not relegating maths to the
equivalent of some 2nd-rate photocopying machine. What I am saying is
that if you photocopy the Magna Carta on *THE* highest quality photocopier
so that even an expert couldn't tell one from the other just by looking
at them, which would you say is the REAL Magna Carta? Have you somehow
"captured" the essence of the Magna Carta and imbued some blank sheet of
paper with it so that you now have two actual Magna Cartas? (Or "Magnae
Cartae"?) No. You have one genuine article, and an abso-fucking-lutely
accurate copy that is still not the actual thing.
This is something I don't really understand, if two things are identical,
but one is "real", then why should it be more valuable?

It's like with famous paintings and fakes, if nobody can tell the real thing
from the fake, what difference does it make?

And why does the person who painted the picture make it more expensive than
the aesthetics of the picture itself would warrant?

But that's a completely separate thing.

And the first line is more philosophical. Like, if I bought a broom, and
one day years later I had to replace the bristles, then a decade later I
replaced the handle, is it the same broom I started off with? Or something
equivalent to buying a new broom? Is there an 'essence' to an object which
goes beyond the molecules which make it up? Is there a soul?

Francis Crick didn't think so. He died today. Or it might have been
yesterday.
Post by Robert de Vincy
Post by Adam
But I suppose my point is that in maths, it is most obvious and all
stated most formally.
In science people can get away with post hoc ergo propter hoc,
They can?!?
I meant in GCSE/A-level project work, sorry, I should have been more clear
here...
Post by Robert de Vincy
Post by Adam
and it is often used in extremeist political arguments,
Valid ones, though? Ones that actually stand up to immense scrutiny?
No, but they seem to convince many uninformed people, people who haven't
been taught to question and find strength and flaws in arguments. (I keep
getting yelled at in newsgroups for calling people 'stupid' so I'm being a
bit more PC here.)
Post by Robert de Vincy
Post by Adam
but in maths it simply isn't possible to use that in a proof. You wouldn't
get marks for "observation" or "spelling and grammar".
"marks"? Are you talking exams here? Sure, you don't get marks in an
exam for observation. There just isn't the time! You can't take a few
hours out and explore something until you see a pattern or some relevant
data emerging. Exams are horribly artificial in that you're force-fed
selected data and that's usually all you have to work with. Ugh.
As for "marks for" "spelling and grammar", I'm not sure what you mean
by that, exactly. If you write the most penetrating analysis of, say,
1980s Communist Russia and how the foreign policy of that decade led to
the country's break-up, does it really affect your argument if you
misspell "Andropov"?
With GCSE/A-level work, in coursework projects, you get marks for
"observations" (i.e. writing down results and putting them in a table) and
you get marked on your correct use of spelling.

adam
Robert de Vincy
2004-07-31 00:01:24 UTC
Permalink
Post by Adam
It, of course, isn't, but the whole point of this thread is me trying
to convince you that it might be and you trying to convince me
otherwise.
What we need to do, is take two identical people, teach one person
maths, and don't teach the other person maths, and watch them
throughout their life, and see, at the age of 20, whether one person
is noticeably cleverer than the other.
That wouldn't be a controlled experiment, not by a long way. To get it
nearer to perfection, each person would have to be identical. Genetically
identical, since it's genes that initially stipulate the body that will
be produced (including, I firmly believe, the person). And then the
environment would have to be identical too. Not just living in the
same house and going to the same school, etc, but 100% identical.
Environment is absolutely crucial to how genes get activated and what
phonetypic effects they bring about. Genetically identical is not
impossible (maybe today, but probably not in the future), but getting
each to experience the same environment apart from the exposure to maths?
That would be a challenge and a half!
Post by Adam
However that isn't quite possible. So what else can we do? We can
look for evidence of mathematical thinking helping with 'real
world' things. Not just adding up or anything to do with numbers at
all, but general problem solving. Like putting a flat-pack bed
together, or ironing a shirt. I believe there is a lot of maths in
common sense.
Hmm, is this a Get Out Clause? Where "maths" is actually just "common
sense"? I mean, if it is, then I hope someone will say so explicitly
and things will seem so much clearer (to me, at least!).
Post by Adam
Post by Robert de Vincy
Before I'm attacked by the Maths Mob, I'm not relegating maths to the
equivalent of some 2nd-rate photocopying machine. What I am saying
is that if you photocopy the Magna Carta on *THE* highest quality
photocopier so that even an expert couldn't tell one from the other
just by looking at them, which would you say is the REAL Magna Carta?
Have you somehow "captured" the essence of the Magna Carta and
imbued some blank sheet of paper with it so that you now have two
actual Magna Cartas? (Or "Magnae Cartae"?) No. You have one
genuine article, and an abso-fucking-lutely accurate copy that is
still not the actual thing.
This is something I don't really understand, if two things are
identical, but one is "real", then why should it be more valuable?
Notice that I didn't attach any notion of "value" in my example.

Quick thought experiment:
You manage to do the above. You've got the most bestestest photocopier
machine in the world EVER and you've also borrowed one of the still-
existing Magna Cartas.
You put it through the machine.
Just as you're admiring your handiwork, someone walks into the room and
asks you "Which one of those is the real Magna Carta?"
Being a truthful person who wouldn't want to lie to another good and honest
person, which one would you point to automatically without using any
guile or deceit?
Honestly?
Post by Adam
It's like with famous paintings and fakes, if nobody can tell the real
thing from the fake, what difference does it make?
Difference to whom?

If you're bringing people's opinions into the matter, then all bets are
off for a sure answer. But from an outside, objective viewpoint, there
is only one that is the genuine article.
Post by Adam
And why does the person who painted the picture make it more expensive
than the aesthetics of the picture itself would warrant?
I was really discussing value and value-linked-to-authenticity, so I
haven't really thought about that.
Post by Adam
But that's a completely separate thing.
I agree.
Post by Adam
And the first line is more philosophical. Like, if I bought a broom,
and one day years later I had to replace the bristles, then a decade
later I replaced the handle, is it the same broom I started off with?
Or something equivalent to buying a new broom?
Once again, that depends on whom you ask. But from a detached viewpoint,
the original broom no longer exists. All you've got are two halves of
a broom joined together, with one of those pieces being joined to a bit
of your original broom some time ago.
Post by Adam
Is there an 'essence' to an object which goes beyond the molecules which
make it up?
I don't think so, no. That's getting scarily into "homeopathy really works!"
territory.
Post by Adam
Is there a soul?
Absolutely not.

But others will disagree, of course!
Post by Adam
Francis Crick didn't think so. He died today. Or it might have been
yesterday.
Yesterday. Someone posted a message on talk.origins just after it
happened.
Post by Adam
With GCSE/A-level work, in coursework projects, you get marks for
"observations" (i.e. writing down results and putting them in a table)
and you get marked on your correct use of spelling.
In theory, that shouldn't happen, but a part of me is cheering on the
encouragement given to accurate spelling!
--
BdeV
Robert de Vincy
2004-07-31 00:15:28 UTC
Permalink
Post by Robert de Vincy
phonetypic
Argh! Phenotypic!

Of all the number of times I've typed that in the past few months, it's
now that I make a typo when I had the best chance of impressing the
socks off everyone at an unusual word!
--
BdeV
Adam
2004-07-31 01:05:41 UTC
Permalink
Post by Robert de Vincy
Post by Adam
It, of course, isn't, but the whole point of this thread is me trying
to convince you that it might be and you trying to convince me
otherwise.
What we need to do, is take two identical people, teach one person
maths, and don't teach the other person maths, and watch them
throughout their life, and see, at the age of 20, whether one person
is noticeably cleverer than the other.
That wouldn't be a controlled experiment, not by a long way. To get it
nearer to perfection, each person would have to be identical. Genetically
identical, since it's genes that initially stipulate the body that will
be produced (including, I firmly believe, the person). And then the
environment would have to be identical too. Not just living in the
same house and going to the same school, etc, but 100% identical.
Environment is absolutely crucial to how genes get activated and what
phonetypic effects they bring about. Genetically identical is not
impossible (maybe today, but probably not in the future), but getting
each to experience the same environment apart from the exposure to maths?
That would be a challenge and a half!
That's exactly what I meant when I said "two identical people", and it's why
I started the next paragraph with "However that isn't quite possible". :)
Post by Robert de Vincy
Post by Adam
However that isn't quite possible. So what else can we do? We can
look for evidence of mathematical thinking helping with 'real
world' things. Not just adding up or anything to do with numbers at
all, but general problem solving. Like putting a flat-pack bed
together, or ironing a shirt. I believe there is a lot of maths in
common sense.
Hmm, is this a Get Out Clause? Where "maths" is actually just "common
sense"? I mean, if it is, then I hope someone will say so explicitly
and things will seem so much clearer (to me, at least!).
I think there is a lot of maths in common sense, and that someone who is
better at maths is gonna have 'more common sense'.

Then again, I'm pretty good at maths, so I'm not exactly unbiased :)
Post by Robert de Vincy
Post by Adam
Post by Robert de Vincy
Before I'm attacked by the Maths Mob, I'm not relegating maths to the
equivalent of some 2nd-rate photocopying machine. What I am saying
is that if you photocopy the Magna Carta on *THE* highest quality
photocopier so that even an expert couldn't tell one from the other
just by looking at them, which would you say is the REAL Magna Carta?
Have you somehow "captured" the essence of the Magna Carta and
imbued some blank sheet of paper with it so that you now have two
actual Magna Cartas? (Or "Magnae Cartae"?) No. You have one
genuine article, and an abso-fucking-lutely accurate copy that is
still not the actual thing.
This is something I don't really understand, if two things are
identical, but one is "real", then why should it be more valuable?
Notice that I didn't attach any notion of "value" in my example.
You manage to do the above. You've got the most bestestest photocopier
machine in the world EVER and you've also borrowed one of the still-
existing Magna Cartas.
You put it through the machine.
Just as you're admiring your handiwork, someone walks into the room and
asks you "Which one of those is the real Magna Carta?"
Being a truthful person who wouldn't want to lie to another good and honest
person, which one would you point to automatically without using any
guile or deceit?
Honestly?
But isn't that only because you humans attach some weird sentimental
property to one particular bunch of atoms over another?

If someone was to jumble the two copies up, is the 'real' one lost forever?

If so, what difference does it make to anything?

Why can't we just destroy one? We are still left with something which is
identical in every possible respect to the 'real' one.
Post by Robert de Vincy
Post by Adam
It's like with famous paintings and fakes, if nobody can tell the real
thing from the fake, what difference does it make?
Difference to whom?
If you're bringing people's opinions into the matter, then all bets are
off for a sure answer. But from an outside, objective viewpoint, there
is only one that is the genuine article.
If the two are *identical* as above, then from an outside, objective
viewpoint, they are both identical. There is no difference between them, as
that was the definition for the creation of the second one.
Post by Robert de Vincy
Post by Adam
Is there an 'essence' to object which goes beyond the molecules which
make it up?
I don't think so, no. That's getting scarily into "homeopathy really works!"
territory.
So, there is no difference between a 'real' set of molecules which make up
the magna-carta, and a second set of identical molecules which has been put
together in the same arrangement so as to make a magna-carta-2.

No physical difference whatsoever.

The only difference is therefore 'mental', or 'sentimental'.

Heh, I've only now just realised the similarity of the words mental and
sentimental.

adam
Robert de Vincy
2004-07-31 02:24:09 UTC
Permalink
Post by Adam
Post by Robert de Vincy
You manage to do the above. You've got the most bestestest
photocopier machine in the world EVER and you've also borrowed one of
the still- existing Magna Cartas.
You put it through the machine.
Just as you're admiring your handiwork, someone walks into the room
and asks you "Which one of those is the real Magna Carta?"
Being a truthful person who wouldn't want to lie to another good and
honest person, which one would you point to automatically without using
any guile or deceit?
Honestly?
But isn't that only because you humans attach some weird sentimental
property to one particular bunch of atoms over another?
Some do, I guess. But I wasn't arguing from that position.

It's not like we took two identical things and arbitrarily assigned one
of them the label "GENUINE!" and the other "FAKE!" A stranger, not
knowing which one was a copy and which was genuine, probably wouldn't
know one from the other, true. But neither his knowledge, nor your
knowledge, nor anyone's knowledge would affect the fact that one really
is a copy and the other really is genuine.
Post by Adam
If someone was to jumble the two copies up, is the 'real' one lost forever?
No. Well, maybe to humankind, but the original would still be there,
wouldn't it? It's not like you destroyed it or anything. You would just
have two artifacts, and you wouldn't know which was genuine and which
was a copy. The problem that would plague investigators for years after
would be: "Which is the real one? We know one of them is!" That doesn't
mean it's lost on some absolute scale, but it would be "lost" if you
looked at it purely via "How much do humans currently know at this
present moment?" It all depends on what perceptive filter (if any) you use
to view the situation; if you pick up the "Sum total of current human
knowledge" filter and try to see the situation through that, then your view
will be clouded by the limits of the "sum total of current human knowledge".
Post by Adam
If so, what difference does it make to anything?
Why can't we just destroy one? We are still left with something which
is identical in every possible respect to the 'real' one.
Oh, sure, it's identical. Everyone would be fooled (we assume). It would
make not a jot of difference to humankind to carry on with the false
idea that the copy was the genuine thing. But if we, once again, stop
looking at it from a "What does humankind know about this thing?" viewpoint
then we can see that the original has been destroyed. We (Usenet readers,
here in 2004, discussing this hypothetical situation) are lucky because
we have created this situation and so we know more than the people who
are living in the hypothetical world we've created. This allows us to
discard the manufactured "What does humankind know about this thing?"
viewpoint and use our own, which -- because of the sequence of events
that lead up to this moment -- does include the necessary information to
enable us to identify the copy.
If you believe in a Supreme Being (God, Allah, whoever) then maybe you
could view that as being able to see the situation and know the copy is
a copy.
If you don't believe in such a thing then tough shit. We've found a
limit to our abilities. Humans aren't omniscient. All we can do is
hope that a Supreme Being consciousness exists that exceeds our limits
and decides to show us the truth, or that we discover a method to expand
what we are capable of knowing in this particular area.
Post by Adam
Post by Robert de Vincy
Difference to whom?
If you're bringing people's opinions into the matter, then all bets
are off for a sure answer. But from an outside, objective viewpoint,
there is only one that is the genuine article.
If the two are *identical* as above, then from an outside, objective
viewpoint, they are both identical. There is no difference between
them, as that was the definition for the creation of the second one.
Ah, yes. But our ability to acquire information about something is
limited. It's not like we've progressed to a point where someone's
announced, "Right, that's it, we are able to know anything and everything
that we ever wanted to know!" There is always a limit to what we do
know (collectively, in this case) and how far our current ability and
methods can take us. The consequence of, say, not being able to travel
back in time to follow the paths of the genuine and the copy is that we
can't use that method to increase our knowledge. What other way could
we conceivably use? I don't know. Whatever method might work is either
currently beyond our ability to conceive ("cognitive closure", as
people such as Colin McGinn call it) or can be conceived but is
impossible to make real with our current technology or because of some
impossibility inherent in the method (even though we CAN think of it).
Post by Adam
Post by Robert de Vincy
I don't think so, no. That's getting scarily into "homeopathy really
works!" territory.
So, there is no difference between a 'real' set of molecules which make up
the magna-carta, and a second set of identical molecules which has been put
together in the same arrangement so as to make a magna-carta-2.
No physical difference whatsoever.
As we view it, assuming that "we" are inhabiting this hypothetical
world... yes. No difference that we can discern.
Post by Adam
The only difference is therefore 'mental', or 'sentimental'.
Yeah, people will have the idea floating about in their minds that one
is genuine and one is fake, but that's all it can ever be at this stage
of knowledge-finding.

Imagine three sets, and define them as:
A. all the facts and knowledge we possess and have immediate access to
now;
B. all the facts and knowledge that our current usable methods of fact-
finding and knowledge-finding are capable of acquiring now;
C. all the facts and knowledge possible.

A exists inside B which exists inside C. A set can only grow as large as
the one it is inside and only as small as the one that is inside it.

Fact F1 "One of these is fake, the other is genuine" occupies a position
in A.
Fact F2 "This one is fake and this other one is genuine" occupies a
position in C.

Since F2 lies outside B, none of the things we have can reach it because
of B's border (the limit of what we can currently learn). Therefore, we
need to expand B's borders.

How to expand B so that it subsumes F2? Well, we need to devise a new
fact-finding method. Clearly, the tools we have at our disposal right now
are useless for the task.

The important point is that F2 exists, somewhere out in the wilderness
of exclusive-C-space.
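Spelled out as literal sets, the nesting looks like this (a rough sketch only; the "other currently learnable facts" element is just a stand-in for everything else that belongs in B):

```python
# A rough model of the three nested knowledge sets, using fact labels.
F1 = "One of these is fake, the other is genuine"
F2 = "This one is fake and this other one is genuine"

A = {F1}                                      # what we possess right now
B = A | {"other currently learnable facts"}   # what our methods could get now
C = B | {F2}                                  # all the facts possible

# A exists inside B which exists inside C ...
assert A <= B <= C

# ... and F2 lies outside B: no current fact-finding method reaches it.
assert F2 in C and F2 not in B
```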
Post by Adam
Heh, I've only now just realised the similarity of the words mental
and sentimental.
You win a prize!
--
BdeV
Ray Pang
2004-07-31 09:24:27 UTC
Permalink
Post by Robert de Vincy
Post by Adam
Post by Robert de Vincy
You manage to do the above. You've got the most bestestest
photocopier machine in the world EVER and you've also borrowed one of
the still- existing Magna Cartas.
You put it through the machine.
Just as you're admiring your handiwork, someone walks into the room
and asks you "Which one of those is the real Magna Carta?"
Being a truthful person who wouldn't want to lie to another good and
honest person, which one would you point to automatically without using
any guile or deceit?
Honestly?
But isn't that only because you humans attach some weird sentimental
property to one particular bunch of atoms over another?
Some do, I guess. But I wasn't arguing from that position.
It's not like we took two identical things and arbitrarily assigned one
of them the label "GENUINE!" and the other "FAKE!" A stranger, not
knowing which one was a copy and which was genuine, probably wouldn't
know one from the other, true. But neither his knowledge, nor your
knowledge, nor anyone's knowledge would affect the fact that one really
is a copy and the other really is genuine.
Post by Adam
If someone was to jumble the two copies up, is the 'real' one lost forever?
No. Well, maybe to humankind, but the original would still be there,
wouldn't it? It's not like you destroyed it or anything.
You would just
have two artifacts, and you wouldn't know which was genuine and which
was a copy. The problem that would plague investigators for years after
would be: "Which is the real one? We know one of them is!"
Post by Adam
If so, what difference does it make to anything?
I still can't see your viewpoint. If one is an atom for atom copy of the
other, then they are indistinguishable, not only by humans, but by
everything, barring the existence of a God-like being which "just knows they
are different". If they are indistinguishable, then they are the same, so
one cannot have a notion of realness attached to it which the other doesn't.
Now this leads me to conclude that the whole idea of an atom for atom
indistinguishable copy would be impossible, because there MUST be a parameter
which makes them different, e.g. position, time of existence, etc. Otherwise
the two things would be in the same position, at the same time, occupying
the same space, with the same atoms in the same configuration. They are one
and the same, in the truest sense.
Post by Robert de Vincy
A. all the facts and knowledge we possess and have immediate access to
now;
B. all the facts and knowledge that our current usable methods of fact-
finding and knowledge-finding are capable of acquiring now;
C. all the facts and knowledge possible.
A exists inside B which exists inside C. A set can only grow as large as
the one it is inside and only as small as the one that is inside it.
Fact F1 "One of these is fake, the other is genuine" occupies a position
in A.
Fact F2 "This one is fake and this other one is genuine" occupies a
position in C.
Since F2 lies outside B, none of the things we have can reach it because
of B's border (the limit of what we can currently learn). Therefore, we
need to expand B's borders.
How to expand B so that it subsumes F2? Well, we need to devise a new
fact-finding method. Clearly, the tools we have at our disposal right now
are useless for the task.
Why isn't it the case that B=C?
Robert de Vincy
2004-07-31 10:06:11 UTC
Permalink
Post by Ray Pang
Post by Robert de Vincy
A. all the facts and knowledge we possess and have immediate access
to now;
B. all the facts and knowledge that our current usable methods of
fact-finding and knowledge-finding are capable of acquiring now;
C. all the facts and knowledge possible.
A exists inside B which exists inside C. A set can only grow as
large as the one it is inside and only as small as the one that is
inside it.
Fact F1 "One of these is fake, the other is genuine" occupies a
position in A.
Fact F2 "This one is fake and this other one is genuine" occupies a
position in C.
Since F2 lies outside B, none of the things we have can reach it
because of B's border (the limit of what we can currently learn).
Therefore, we need to expand B's borders.
How to expand B so that it subsumes F2? Well, we need to devise a
new fact-finding method. Clearly, the tools we have at our disposal
right now are useless for the task.
Why isn't it the case that B=C?
Because we can't tell the copy from the genuine.

One -- we know for absolute certain -- is genuine. The other -- we also
know with absolute certainty -- is a copy. This is covered by F1.

But we have no way of being able to tell which was the result of our
highly impressive copying machine and which was created in the 13th
century. Therefore, there must be a discrepancy between what knowledge
may possibly be known (set C) and what our technology and methods are
capable of knowing at this moment (set B).

To satisfy B=C, we would have to be capable -- right now at this moment
using all or any of the currently extant methods and technology -- of
finding out *ANY* fact. Anything. But just think about all the mysteries
that are still unsolved despite people throwing every piece of reasoning
and investigation at them. And we've created our own Unsolvable Mystery
right here: which is the copy, which is genuine? Nothing we have now can
answer this question, but it MIGHT be answered someday, yes?

And if you believe that humans have "cognitive closure" then C itself
will be inside its own system, with the hypothetical set D being all
the knowledge that we (in our current stage of development) are not capable
of thinking about right now (it's next-to-impossible to give an example
here, since to give an example would be to conceive of that idea and then
by virtue of my being able to conceive of it, it falls into C and not D),
and set E being a sort of universal set that contains the whole system.
(There is probably an intermediate set between D and E, but I find it a
tricky thing to put into words that don't lead to a confusing jumble.)
Imagine redesigning the set definitions above for a spider.
Fact F3 = "How much return I would get from £10,000 held in a savings
account over a year at 4% APR." A spider would never be able to even
conceive of such a thing. F3 (for a spider, and all other cognitive
creatures we know about except humans) would lie not just outside B but
outside C, too. But it *WOULD* have to exist somewhere in the Universal
Set of Knowledge because humans are capable of possessing that knowledge.
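For what it's worth, F3 works out in one line (assuming simple rather than compound interest, which the post doesn't specify):

```python
# F3 worked out: one year's simple interest on £10,000 at 4% APR.
principal_pounds = 10_000
apr_percent = 4
interest_pounds = principal_pounds * apr_percent // 100
assert interest_pounds == 400  # the return a spider could never conceive of
```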
Maybe there are sets D, E, etc for us, too? Maybe F2 lies in there?
But this is somewhat controversial and not universally accepted. Maybe
for humans, C=(D,E,etc) in comparison to every other species where C<(D,E,etc),
but we have no way of testing this.
--
BdeV
Robert de Vincy
2004-07-31 10:11:59 UTC
Permalink
Maybe F2 lies in [set D]?
Idiot! Of course it doesn't!

10 points for anyone who can tell me why.
15 points for anyone who can tell me why I wrote such a silly thing.
--
BdeV
Adam
2004-07-31 11:29:08 UTC
Permalink
Post by Robert de Vincy
Maybe F2 lies in [set D]?
Idiot! Of course it doesn't!
10 points for anyone who can tell me why.
Because we can think about it. We can understand the question?
Post by Robert de Vincy
15 points for anyone who can tell me why I wrote such a silly thing.
Maybe you were thinking it will forever be impossible to determine which is
fake and which is real.

But that seems unlikely, since you already proposed time travel, unless you
believe time travel is fundamentally impossible?

But that too seems unlikely.

So perhaps you were thinking about the spider?

adam
Robert de Vincy
2004-07-31 12:22:21 UTC
Permalink
Post by Adam
Post by Robert de Vincy
Maybe F2 lies in [set D]?
Idiot! Of course it doesn't!
10 points for anyone who can tell me why.
Because we can think about it. We can understand the question?
Correct. Simply, if we can define the fact, it automatically has
membership of C, before we even begin to find out if we already possess
that knowledge.
Post by Adam
Post by Robert de Vincy
15 points for anyone who can tell me why I wrote such a silly thing.
Maybe you were thinking it will forever be impossible to determine
which is fake and which is real.
But that seems unlikely, since you already proposed time travel,
unless you believe time travel is fundamentally impossible?
That wouldn't put the fact beyond C. The lack of a working time-machine
merely blocks off one possible way of expanding B (and, ultimately, A).
We still know that F2 is out there floating about in C-space somewhere
(we've defined it, remember?) so time-machine or no time-machine, it
remains within the boundaries of inclusive-C.
Post by Adam
But that too seems unlikely.
So perhaps you were thinking about the spider?
The correct answer is: because I need sleep.
--
BdeV
Adam
2004-07-31 11:39:50 UTC
Permalink
Post by Robert de Vincy
Post by Ray Pang
Post by Robert de Vincy
A. all the facts and knowledge we possess and have immediate access
to now;
B. all the facts and knowledge that our current usable methods of
fact-finding and knowledge-finding are capable of acquiring now;
C. all the facts and knowledge possible.
A exists inside B which exists inside C. A set can only grow as
large as the one it is inside and only as small as the one that is
inside it.
Fact F1 "One of these is fake, the other is genuine" occupies a
position in A.
Fact F2 "This one is fake and this other one is genuine" occupies a
position in C.
Since F2 lies outside B, none of the things we have can reach it
because of B's border (the limit of what we can currently learn).
Therefore, we need to expand B's borders.
How to expand B so that it subsumes F2? Well, we need to devise a
new fact-finding method. Clearly, the tools we have at our disposal
right now are useless for the task.
Why isn't it the case that B=C?
Because we can't tell the copy from the genuine.
One -- we know for absolute certain -- is genuine. The other -- we also
know with absolute certainty -- is a copy. This is covered by F1.
But we have no way of being able to tell which was the result of our
highly impressive copying machine and which was created in the 13th
century. Therefore, there must be a discrepancy between what knowledge
may possibly be known (set C) and what our technology and methods are
capable of knowing at this moment (set B).
To satisfy B=C, we would have to be capable -- right now at this moment
using all or any of the currently extant methods and technology -- of
finding out *ANY* fact. Anything. But just think about all the mysteries
that are still unsolved despite people throwing every piece of reasoning
and investigation at them. And we've created our own Unsolvable Mystery
right here: which is the copy, which is genuine? Nothing we have now can
answer this question, but it MIGHT be answered someday, yes?
And if you believe that humans have "cognitive closure" then C itself
will be inside its own system, with the hypothetical set D being all
the knowledge that we (in our current stage of development) are not capable
of thinking about right now (it's next-to-impossible to give an example
here, since to give an example would be to conceive of that idea and then
by virtue of my being able to conceive of it, it falls into C and not D),
and set E being a sort of universal set that contains the whole system.
(There is probably an intermediate set between D and E, but I find it a
tricky thing to put into words that don't lead to a confusing jumble.)
Imagine redesigning the set definitions above for a spider.
Fact F3 = "How much return I would get from £10,000 held in a savings
account over a year at 4% APR." A spider would never be able to even
conceive of such a thing. F3 (for a spider, and all other cognitive
creatures we know about except humans) would lie not just outside B but
outside C, too. But it *WOULD* have to exist somewhere in the Universal
Set of Knowledge because humans are capable of possessing that knowledge.
Maybe there are sets D, E, etc for us, too? Maybe F2 lies in there?
But this is somewhat controversial and not universally accepted. Maybe
for humans, C=(D,E,etc) in comparison to every other species where C<(D,E,etc),
but we have no way of testing this.
This stuff is really interesting and I shall tell you why.

It has interested me for a long time, the way human knowledge can just be
destroyed when a person dies. It comes up frequently in court cases, where
only the murderer and the victim knows the identity of the murderer, and if
the murderer dies then that information is lost forever (and a few more
special circumstances to ensure that asking where *everyone* else is won't
help).

Perhaps a better example is motivation, why, for example, people go on
random shooting sprees and then kill themselves. Or perhaps just really old
mysteries. And things like "what did Aristotle have for lunch on his 15th
birthday?". Information like that can never be retrieved, and that bothered
me.

Unless you believe time travel is possible, but that opens up a whole can of
worms, cos then you have the problem of affecting the past (which is
inevitable according to Heisenberg whenever you observe anything, or can we
one day get around that?).

So anyway, thanks for informing me that this kinda thing bothers other
people as well! ;-)


Oh yeah, why doesn't D include C? You said D was everything we are not
capable of thinking about right now. But normally when talking about sets
like this don't we continue expanding them, and then say D-not-C is the set
of everything we are not capable of thinking about right now?

And also, I'm not sure if I understand what you mean by set E, the set of
everything we will never understand? How can there be such a set? You'd
have to believe that there is a limit to human knowledge. Such a question
is surely in set C and thus has no answer, so you can't be sure set E
exists?

(this is fun)

adam
Robert de Vincy
2004-07-31 12:15:38 UTC
Permalink
Post by Adam
Post by Robert de Vincy
Because we can't tell the copy from the genuine.
One -- we know for absolute certain -- is genuine. The other -- we
also know with absolute certainty -- is a copy. This is covered by
F1.
But we have no way of being able to tell which was the result of our
highly impressive copying machine and which was created in the 13th
century. Therefore, there must be a discrepancy between what
knowledge may possibly be known (set C) and what our technology and
methods are capable of knowing at this moment (set B).
To satisfy B=C, we would have to be capable -- right now at this
moment using all or any of the currently extant methods and
technology -- of finding out *ANY* fact. Anything. But just think
about all the mysteries that are still unsolved despite people
throwing every piece of reasoning and investigation at them. And
we've created our own Unsolvable Mystery right here: which is the
copy, which is genuine? Nothing we have now can answer this
question, but it MIGHT be answered someday, yes?
And if you believe that humans have "cognitive closure" then C itself
will be inside its own system, with the hypothetical set D being all
the knowledge that we (in our current stage of development) are not
capable of thinking about right now (it's next-to-impossible to give
an example here, since to give an example would be to conceive of
that idea and then by virtue of my being able to conceive of it, it
falls into C and not D), and set E being a sort of universal set that
contains the whole system. (There is probably an intermediate set
between D and E, but I find it a tricky thing to put into words that
don't lead to a confusing jumble.) Imagine redesigning the set
definitions above for a spider. Fact F3 = "How much return I would
get from £10,000 held in a savings account over a year at 4% APR." A
spider would never be able to even conceive of such a thing. F3 (for
a spider, and all other cognitive creatures we know about except
humans) would lie not just outside B but outside C, too. But it
*WOULD* have to exist somewhere in the Universal Set of Knowledge
because humans are capable of possessing that knowledge. Maybe there
are sets D, E, etc for us, too? Maybe F2 lies in there? But this is
somewhat controversial and not universally accepted. Maybe for
humans, C=(D,E,etc) in comparison to every other species where C<(D,E,
etc), but we have no way of testing this.
This stuff is really interesting and I shall tell you why.
It has interested me for a long time, the way human knowledge can just
be destroyed when a person dies. It comes up frequently in court
cases, where only the murderer and the victim knows the identity of
the murderer, and if the murderer dies then that information is lost
forever (and a few more special circumstances to ensure that asking
where *everyone* else is won't help).
Perhaps a better example is motivation, why, for example, people go on
random shooting sprees and then kill themselves. Or perhaps just
really old mysteries. And things like "what did Aristotle have for
lunch on his 15th birthday?". Information like that can never be
retrieved, and that bothered me.
Yep. Even though our A keeps on expanding in some areas (we're always
learning more stuff, as a collective whole), some other stuff that it
is possible to know (simply by going up to a person when he's alive and
asking him what he had for lunch on his 15th birthday) slips out of B
and into C. Written language was a great leap forward in stopping a
constant migration of facts from B into C, as would be...
Post by Adam
Unless you believe time travel is possible, but that opens up a whole
can of worms, cos then you have the problem of affecting the past
(which is inevitable according to Heisenberg whenever you observe
anything, or can we one day get around that?).
At the moment time travel became possible, there would be an enormous
leap of facts from C into B, and then -- as the time travel machine was
used -- from B into A.

The whole paradox thing of time-travel is, as you say, a whole bunch of
worms from a different can that I haven't even begun to think about or
get a grip on.
Post by Adam
So anyway, thanks for informing me that this kinda thing bothers other
people as well! ;-)
Oh yeah, why doesn't D include C? You said D was everything we are
not capable of thinking about right now. But normally when talking
about sets like this don't we continue expanding them, and then say
D-not-C is the set of everything we are not capable of thinking about
right now?
Ah, perhaps my definition of D was not elaborate enough. You're right
that D sorta includes C in one way (so that we can imagine our "expanding
area" analogy properly), but it doesn't in another way, because it implies
that C (stuff possible to know) is a proper subset of D (stuff that isn't
possible to know). Each definition, as you go up each level, is cumulative
in that it must inherit the properties of all that came before it *AND*
the properties that are unique to that set. I tried to keep it simple.
Maybe it was too simple!
Post by Adam
And also, I'm not sure if I understand what you mean by set E, the set
of everything we will never understand?
No. The set of every fact. A Universal Set of All Possible Knowledge.
Even the things we can't (yet?) conceivably think about and, possibly,
will never be able to think about. We (stuck in our small, human world-
view) can't even begin to imagine what exclusive-E would contain. Unless,
of course, you discount cognitive closure and believe that there is nothing
in the whole of reality that we can't possibly think of, then a whole
lot of those sets would cease to be different (that is, anything from D
onwards will be the same as C).
Post by Adam
How can there be such a set? You'd have to believe that there is a limit
to human knowledge. Such a question is surely in set C and thus has no
answer, so you can't be sure set E exists?
Exactly. Welcome to the debate about cognitive closure! We can see
that such a limit exists for other ("lower"?) creatures, but because we
can't take a step beyond our own thinking and see the bigger picture as
it relates to us (as we can to, say, a spider) then there's no way of
being absolutely certain that we do have an upper limit.
If we look at the rest of the cognitive creatures we know about, we can
see that they all have limits, they are all incapable of knowing or
being aware of things that we know exist. So, with our being just yet
another cognitive earthly creature, then surely we must have such a limit,
too, yeah?
However, if we consider ourselves as special or set apart from the rest
of creation on account of our consciousness (or soul or being chosen by
God or whatever) then it seems not impossible that we have none of those
limits we see in other creatures.
But it's hard to test, as I said. As soon as we think of something as
an example of existing beyond our C boundary then -- pop! -- it magically
appears inside our C boundary!
Post by Adam
(this is fun)
Not when you've been up all night and still haven't had breakfast at 1pm
the next day.
--
BdeV
Dr A. N. Walker
2004-08-02 12:35:53 UTC
Permalink
[...] What I am saying is
that if you photocopy the Magna Carta on *THE* highest quality photocopier
[do] you now have two actual Magna Cartas? (Or "Magnae
Cartae"?) No.
When you invented this example, were you intending to make
something of the fact that there are already in existence four actual
MCs? Or is that too subtle?
[...] If you write the most penetrating analysis of, say,
1980s Communist Russia and how the foreign policy of that decade led to
the country's break-up, does it really affect your argument if you
misspell "Andropov"?
Am I seeing more gremlins where none were intended? If you
mis-spell "Andropov" as "Brezhnev", it could be very serious. OTOH,
"Andropov" is not an English word anyway, and the more sensible
question is not about its spelling but about its correct or otherwise
transliteration from the Cyrillic. For "Andropov", this is [AFAIK]
scarcely an issue; for "Khruschev", "Tschaikowsky" or "Niemtsowitsch"
it surely is.
--
Andy Walker, School of MathSci., Univ. of Nott'm, UK.
***@maths.nott.ac.uk
Robert de Vincy
2004-08-02 14:31:24 UTC
Permalink
Post by Dr A. N. Walker
[...] What I am saying is
that if you photocopy the Magna Carta on *THE* highest quality photocopier
[do] you now have two actual Magna Cartas? (Or "Magnae
Cartae"?) No.
When you invented this example, were you intending to make
something of the fact that there are already in existence four actual
MCs? Or is that too subtle?
I do realize there are "copies" (that were all made at roughly the same time, if I
remember correctly -- the exhibition thingy in Lincoln Castle where one of
those "copies" lives was quite informative last time I was there), and I think
one of my other messages hints at using one of the existing versions in
acknowledgement that there is more than one.

The criterion I was using in all my messages was that the "copy" be created
there and then by the copying machine, with the "original" being the source of
that copy. For the purposes of the argument, I'm not sure how the existence
of other artifacts that resemble my "original" (even though they would be less
perfect a copy than the machine "copy") affects the point at all.
Post by Dr A. N. Walker
[...] If you write the most penetrating analysis of, say, 1980s
Communist Russia and how the foreign policy of that decade led to
the country's break-up, does it really affect your argument if you
misspell "Andropov"?
Am I seeing more gremlins where none were intended? If you
mis-spell "Andropov" as "Brezhnev", it could be very serious.
Is that actually a likely misspelling? You could use that excuse for any
'wrong' answer. "Oh, I meant to write 25 but I accidentally wrote 100
instead!"
Post by Dr A. N. Walker
OTOH, "Andropov" is not an English word anyway,
I disagree that it "is not an English word", since "an English word"
is such a loose and fluid concept.
In an English sentence, it behaves according to the rules of English
syntax, morphology, and (if spoken) phonology. The sequence of letters/
phonological segments behave according to all the rules that govern
other English words.

Let's say I have two brothers waiting to see me at work -- Sergei Andropov
and Nikolai Andropov. Would it be acceptable to say that I have to talk to
the two Andropovs? (By "acceptable", I mean "Would it be an understandable
thing to say in such a situation?") I think so.
I've just pluralized a noun using the English regular pluralizing rule and not
the Russian version.

If Mr Andropov had some specific character trait or method or something
characteristic to him, it would be perfectly valid to say "That's a very
Andropovian way of doing things!" "-ian" is a productive derivational
suffix within English to make an adjective from a noun. We wouldn't use
the Russian equivalent, would we?

You might now say something like "Ah, but it's not in The Dictionary!"
Well, I've just looked in the New Shorter OED and found "Nganasan" (a
transliterated word, just like "Andropov"). Is that "an English word"? Why
does that one become a valid English word and "Andropov" not share the honour?
Post by Dr A. N. Walker
and the more sensible question is not about its spelling but about its
correct or otherwise transliteration from the Cyrillic. For "Andropov",
this is [AFAIK] scarcely an issue; for "Khruschev", "Tschaikowsky" or
"Niemtsowitsch" it surely is.
Y-e-s... and would it matter if you wrote "Tchaikovski" instead?
--
BdeV
Dr A. N. Walker
2004-08-02 15:43:05 UTC
Permalink
I do realize there are "copies" [...]
Yes, I realise that "everyone knows" there are several MCs;
I just wondered whether that realisation was part of your motivation
for using that example, so that later in the thread you could pull
some rabbit out of the hat?
Post by Dr A. N. Walker
Am I seeing more gremlins where none were intended? If you
mis-spell "Andropov" as "Brezhnev", it could be very serious.
Is that actually a likely misspelling? You could use that excuse for any
'wrong' answer. "Oh, I meant to write 25 but I accidentally wrote 100
instead!"
This is not so different from what often happens in maths;
if you have fifty divided by a half, then among the general population
the answers 25 and 100 are roughly equiprobable ....
Post by Dr A. N. Walker
OTOH, "Andropov" is not an English word anyway,
I disagree that it "is not an English word", since "an English word"
is such a loose and fluid concept.
Only to you linguists. To the rest of us, we all know what
we mean.
In an English sentence, it behaves according to the rules of English
syntax, morphology, and (if spoken) phonology. The sequence of letters/
phonological segments behave according to all the rules that govern
other English words.
But that is because it is in an English sentence. The only
thing that distinguishes "Andropov" from "Quertleyoop" is that you
and some/many other readers know, or think you know, something about
Mr Andropov so that the word has a resonance. If you didn't already
know that I've just invented the Q-word, then I could almost certainly
have bamboozled you in some debate by claiming that the issue could be
resolved by applying Quertleyoopian theories, following Q's seminal
work in the 1970's on matroidal semi-tropes based on the blah, blah.
Indeed, I often read books/papers that could well be bamboozling me
in just such a way; how would I know? That doesn't make Q, or any
other random [but more-or-less pronounceable] sequence of letters an
English word.

[...]
You might now say something like "Ah, but it's not in The Dictionary!"
I might, but I wouldn't.
Well, I've just looked in the New Shorter OED and found "Nganasan" (a
transliterated word, just like "Andropov"). Is that "an English word"? Why
does that one become a valid English word and "Andropov" not share the honour?
As I have not the foggiest notion what "Nganasan" means, I
have no idea, but also doubt whether it has any claim to be an English
word. An English dictionary is not a definitive list of English words,
but rather a help to the reader who wants to know what the various words
that may occur in an English sentence mean. If and when "Nganasan" gets
to be common enough that many educated people *do* know what it means,
then it will have a claim on Englishness. Until then, it's an illegal
immigrant ....
Post by Dr A. N. Walker
and the more sensible question is not about its spelling but about its
correct or otherwise transliteration from the Cyrillic. For "Andropov",
this is [AFAIK] scarcely an issue; for "Khruschev", "Tschaikowsky" or
"Niemtsowitsch" it surely is.
Y-e-s... and would it matter if you wrote "Tchaikovski" instead?
In many cases, yes. For example, I have some examples of
Russian sheet music; "Tschaikowsky" sounds more like a sneeze than
the transliteration [according to *my* knowledge of Russian] of the
Cyrillic version of PIT's name, so if he had been less famous, I
might never have spotted that "my" composer and "yours" were the
same person. If you're trying to search a computerised database,
then Affabeck Lauder would never connect "Khru..." and "Xru..." or
"Tschai..." and "Chai"; and the various spellings of "Nimzovich"
are a nightmare for people trying to eliminate duplicates from their
megabases of chess games. You may also be aware that the different
spellings of the name of the Libyan leader are a well known test
case for regular expression analysers.
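[Editorially, the spelling-variant problem sketched above can be made
concrete. This is a hypothetical illustration only, not the "well known
test case" itself: one regular expression covering a handful of the
published romanisations of the Libyan leader's surname. The variant
set and the pattern are deliberately small; real-world lists run to
dozens of spellings.]

```python
import re

# A single pattern matching several common romanisations:
# Gaddafi, Qaddafi, Khadafy, Gadhafi, Qadhdhafi, ...
# (Illustrative only; a serious deduplicator would use a fuller
# pattern or a normalisation table.)
NAME = re.compile(r"[GKQ]h?[ae]dh?(?:dh?)?af[iy]", re.IGNORECASE)

def same_person(spellings):
    """True if every spelling in the list matches the one pattern."""
    return all(NAME.fullmatch(s) for s in spellings)
```

[A database deduplicator, e.g. for the "megabases of chess games"
mentioned above, would normalise every matching form to one canonical
key rather than merely testing membership, but the matching step is
the part the post alludes to.]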
--
Andy Walker, School of MathSci., Univ. of Nott'm, UK.
***@maths.nott.ac.uk
Robert de Vincy
2004-08-02 18:42:49 UTC
Permalink
Post by Dr A. N. Walker
I do realize there are "copies" [...]
Yes, I realise that "everyone knows" there are several MCs;
I just wondered whether that realisation was part of your motivation
for using that example, so that later in the thread you could pull
some rabbit out of the hat?
No. That would be devilishly sly.
Post by Dr A. N. Walker
Post by Dr A. N. Walker
OTOH, "Andropov" is not an English word anyway,
I disagree that it "is not an English word", since "an English word"
is such a loose and fluid concept.
Only to you linguists. To the rest of us, we all know what
we mean.
In an English sentence, it behaves according to the rules of English
syntax, morphology, and (if spoken) phonology. The sequence of
letters/ phonological segments behave according to all the rules that
govern other English words.
But that is because it is in an English sentence.
The only thing that distinguishes "Andropov" from "Quertleyoop" is that
you and some/many other readers know, or think you know, something about
Mr Andropov so that the word has a resonance. If you didn't already
know that I've just invented the Q-word, then I could almost certainly
have bamboozled you in some debate by claiming that the issue could be
resolved by applying Quertleyoopian theories, following Q's seminal
work in the 1970's on matroidal semi-tropes based on the blah, blah.
Indeed, I often read books/papers that could well be bamboozling me
in just such a way; how would I know? That doesn't make Q, or any
other random [but more-or-less pronounceable] sequence of letters an
English word.
As soon as you take a sequence of letters and use it in an English sentence
(excluding 'mentions'), it must acquire Englishwordness or else it just
won't fit! You will, by default, have assigned it automatically to a word-
class (most obviously by its distribution, but also by any inflections you
might add by choice or necessity). You could object and reply "Oh, but
'kplkwl" is a sequence of letters; would that be a valid English word by
your standards?" Why not? We can incorporate initialisms into English
sentences and still claim they are English -- there are several strategies
for turning those initialisms into speakable sequences of phonetic segments,
picking the right one to fit in with English phonotactics. "Ah! But
initialisms are usually written in capital letters!" Um, in speech they're
not. Or if they are, we can't tell.

Which of the following are "English words"? [Remember: their presence or
absence from a dictionary is not a definitive test, for several reasons, a
few of which I list below.]
1. intravilate
2. eignes
3. fluviolacustrine
4. arbeful
5. rhabditiform
You just can't tell, especially not by looking at them in isolation, unless
you have already seen them used elsewhere.
Let's suppose that the preceding discussion had not been about "Andropov"
and I throw in "6. andropov". Would you be able to leap on it and say
"Ah-hah! That's not an English word!" If you could -- if there is some clue
or some intuition or some divine inspiration telling you automatically that
this is not "an English word" -- then that same ability should let you
identify the words that I made up for the list above with something
approaching 100% accuracy.

We could look in a dictionary (or "The Dictionary" as people tend to say,
as if there were one ultimate authority) but that will surely not be a
conclusive test (unless we are narrowly defining "an English word" to be
"a word that is listed in Dictionary X" -- a bit like in competitive
Scrabble where only one dictionary contains legal words). The dictionary
we pick may omit our suspect word for any of several reasons:
- space;
- it was compiled/published too long ago or too recently;
- editorial/compilers' policy;
- the word has simply been overlooked.


So, we can't rely on etymology for testing what is and what is not "an
English word". We can't rely entirely on pronounceability, since that
would discard initialisms that contravene English phonotactics yet we use
them every day without any feelings of "foreign"ness towards them. We
can't rely on a native-speaker's intuition. We can't rely on consulting
a dictionary. What can we use? How did you arrive at the decision that
"Andropov" is not an English word? Let me guess: "Heads it was, tails it
wasn't"?
Post by Dr A. N. Walker
Well, I've just looked in the New Shorter OED and found "Nganasan" (a
transliterated word, just like "Andropov"). Is that "an English
word"? Why does that one become a valid English word and "Andropov"
not share the honour?
As I have not the foggiest notion what "Nganasan" means, I
have no idea, but also doubt whether it has any claim to be an English
word. An English dictionary is not a definitive list of English
words, but rather a help to the reader who wants to know what the
various words that may occur in an English sentence mean. If and when
"Nganasan" gets to be common enough that many educated people *do*
know what it means, then it will have a claim on Englishness. Until
then, it's an illegal immigrant ....
Ah! So if enough educated people know the meaning of a word, then it
acquires Englishwordness? Okay.

But how many educated people would know what the non-made-up words in my
list above mean? What number are you setting for the quorum of
Englishwordness?

And this criterion would actually include "Andropov" into the élite of
Englishwordness, as I'm sure there are more people who know what "Andropov"
refers to than there are who know what, say, "skibbet" refers to.
Post by Dr A. N. Walker
Y-e-s... and would it matter if you wrote "Tchaikovski" instead?
In many cases, yes. For example, I have some examples of
Russian sheet music; "Tschaikowsky" sounds more like a sneeze than
the transliteration [according to *my* knowledge of Russian] of the
Cyrillic version of PIT's name, so if he had been less famous, I
might never have spotted that "my" composer and "yours" were the
same person. If you're trying to search a computerised database,
then Affabeck Lauder would never connect "Khru..." and "Xru..." or
"Tschai..." and "Chai"; and the various spellings of "Nimzovich"
are a nightmare for people trying to eliminate duplicates from their
megabases of chess games. You may also be aware that the different
spellings of the name of the Libyan leader are a well known test
case for regular expression analysers.
But the context that started this was a hypothetical misspelling in an
(A-level) essay.

Context is king.

When I mentioned misspelling "Andropov" not affecting an essay, I never
intended it to be a general "Well, you can misspell any word you like,
it dosn't mater."

If someone wrote "Brezhnev" instead of "Andropov", there is no justification
that this is a misspelling. Anyone who thinks that <B> can ever be an
orthographical representation of [a] is clearly in trouble academically.
And that's just the first letter...
--
BdeV
Dr A. N. Walker
2004-08-03 13:28:50 UTC
Permalink
Post by Robert de Vincy
As soon as you take a sequence of letters and use it in an English sentence
(excluding 'mentions'), it must acquire Englishwordness or else it just
won't fit! [...]
Oh, nonsense. If I ask you to "pass the fromage", "fromage"
remains a French word and does not become English. *If* "bonjour"
and "fromage" and others become commonly used in English, in the
same way that "restaurant" did, and not just French words that all
educated people happen to know, *then* they will become naturalised
[and may well then change their pronunciation].
Post by Robert de Vincy
Which of the following are "English words"? [...]
None of them. If I, in my capacity as native Englishman with
relatively large passive vocabulary and experience of doing Ximenes,
don't recognise any of them, then they are not English. If, OTOH,
you can demonstrate that any competent geologist [say] quite normally
goes around saying "I must intravilate the eignes, or the arbeful
deposits of fluviolacustrine will become rhabditiform", then I am
open to persuasion.
Post by Robert de Vincy
[...] The dictionary
we pick may omit our suspect word for reasons of [...]
- the word has been overlooked.
While I'm here, can you explain why "puther" [as in "the
smoke puthered from the chimney"] is not in any of the dozen+
dictionaries I possess? "Everyone" knows the word. During WW2
there were "puther pots" to put up smoke screens. It's not the
same as "pother", which *is* in most of them.
Post by Robert de Vincy
[...] How did you arrive at the decision that
"Andropov" is not an English word? Let me guess: "Heads it was, tails it
wasn't"?
No, it's a proper name of someone moderately famous but who
does not yet, and probably never will, have an associated quality
[or whatever] worthy of being an English word, unlike [eg] Plato,
Euclid, Hoover, Churchill. He belongs in an encyclopaedia, not a
dictionary.
Post by Robert de Vincy
But how many educated people would know what the non-made-up words in my
list above mean? What number are you setting for the quorum of
Englishwordness?
Forty, obviously. No, there is no set rule. I don't know
whether, or to what extent, *you* are English [or British], nor even
to what extent you would, if sufficiently informed, count me as
English. As there are laws about being English that have great
practical consequence, we need set rules about people, but then
there are lots of very tough cases in the margins. We have no such
need for words, as there are no financial/legal consequences. It
doesn't *matter* that "Andropov" is not an English word, but that
"skibbet" is [if indeed it is].
Post by Robert de Vincy
And this criterion would actually include "Andropov" into the élite of
Englishwordness, as I'm sure there are more people who know what "Andropov"
refers to than there are who know what, say, "skibbet" refers to.
Yes. But once you'd explained what the words meant and what
their etymology was and so on, then I'd wager that no-one would change
their minds about "Andropov", whereas you might perhaps find that many
would say "Ah yes, I never realised there was a good old English word
for the plastic things you shove into the corners of shirt collars,
I'll start to use it." And of course there is a huge grey area of
the technical words used in maths, geography, linguistics, etc., and
of regional dialect words, which may have quite limited numbers of
users but be invaluable to just those people.

[FWIW, I would like -- yes, I know it's sad, but at least it's
not (yet) a substitute for sex -- some good words for jigsaw puzzles.
As in "I'm looking for a piece with three outs, two normal but one
turned up clockwise, and an in, and that has a blue top left shoulder
and a yellow strand across the tab and that is slightly short on the
right". If everyone understood "... a blue-yellow type-seven snub
with right kurtosis", we'd get on much better.]

[Also a propos of nothing very much at all, there is a family
friend who was asked as a small boy at school what a picture was of,
and instead of the expected "lorry", took one glance and said "it's
an articulated six-wheel flatback with drop sides and a hoist", or
some such.]
--
Andy Walker, School of MathSci., Univ. of Nott'm, UK.
***@maths.nott.ac.uk
Robert de Vincy
2004-08-03 15:13:46 UTC
Permalink
Post by Dr A. N. Walker
Post by Robert de Vincy
As soon as you take a sequence of letters and use it in an English
sentence (excluding 'mentions'), it must acquire Englishwordness or
else it just won't fit! [...]
Oh, nonsense.
Not nonsense. It's true... it won't "fit". You've actually proved it:
Post by Dr A. N. Walker
If I ask you to "pass the fromage", "fromage" remains a French word and
does not become English.
"Pass the fromage." "Fromage" has acquired an English determiner, lost
all traces of gender, and the sentence could equally apply to several
varieties of cheese as it could to one type of cheese (exhibiting the
countable/non-countable attribute of English "cheese"). And that's
just a few things it has actually acquired. If it were, say, the Russian
word for "cheese" that you used instead of the French one then it would
have lost its case inflection.
And then there's the pronunciation. Which English speakers would say
"Pass the fromage" with an uvular <r> in "fromage"?

For something that you say "does not become English" it's certainly doing
a bad job of retaining its Foreignwordness.
Post by Dr A. N. Walker
*If* "bonjour" and "fromage" and others become commonly used in English,
in the same way that "restaurant" did, and not just French words that
all educated people happen to know, *then* they will become naturalised
[and may well then change their pronunciation].
So it's a majority vote thing? Well, that's fine by me if that's the
way you personally distinguish "an English word" from "not an English
word". However, as I mentioned below (the example with comparing
"Andropov" to "skibbet") this criterion will throw up some serious
exceptions.
Post by Dr A. N. Walker
Post by Robert de Vincy
Which of the following are "English words"? [...]
None of them. If I, in my capacity as native Englishman with
relatively large passive vocabulary and experience of doing Ximenes,
don't recognise any of them, then they are not English.
Your criterion of what makes a word "English" keeps shifting. Here you
say it's if you recognize it. Earlier you said that it's if the word
is used by the majority. Unless you're the selected representative for
that majority, then the two criteria have the potential to work against
each other.
Post by Dr A. N. Walker
If, OTOH, you can demonstrate that any competent geologist [say] quite
normally goes around saying "I must intravilate the eignes, or the
arbeful deposits of fluviolacustrine will become rhabditiform", then I
am open to persuasion.
I thought it was only if the majority of English speakers use the words.

Your definition of "is an English word" is incredibly slippery.
Post by Dr A. N. Walker
Post by Robert de Vincy
[...] The dictionary we pick may omit our suspect word for reasons
of [...]
- the word has been overlooked.
While I'm here, can you explain why "puther" [as in "the
smoke puthered from the chimney"] is not in any of the dozen+
dictionaries I possess? "Everyone" knows the word. During WW2
there were "puther pots" to put up smoke screens. It's not the
same as "pother", which *is* in most of them.
I've never heard of "puther". Therefore... IT IS NOT AN ENGLISH WORD!
Post by Dr A. N. Walker
Post by Robert de Vincy
[...] How did you arrive at the decision that
"Andropov" is not an English word? Let me guess: "Heads it was, tails
it wasn't"?
No, it's a proper name of someone moderately famous but who
does not yet, and probably never will, have an associated quality
[or whatever] worthy of being an English word, unlike [eg] Plato,
Euclid, Hoover, Churchill. He belongs in an encyclopaedia, not a
dictionary.
And there's another definition from you: if it's a proper noun, then it
must have a distinctive quality or idea that we can associate with that
person/thing.
I do wish you'd make up your mind. The heads/tails thing is looking
more and more inviting by the second.
Post by Dr A. N. Walker
Post by Robert de Vincy
But how many educated people would know what the non-made-up words in
my list above mean? What number are you setting for the quorum of
Englishwordness?
Forty, obviously. No, there is no set rule. I don't know
whether, or to what extent, *you* are English [or British], nor even
to what extent you would, if sufficiently informed, count me as
English. As there are laws about being English that have great
practical consequence, we need set rules about people, but then
there are lots of very tough cases in the margins. We have no such
need for words, as there are no financial/legal consequences. It
doesn't *matter* that "Andropov" is not an English word, but that
"skibbet" is [if indeed it is].
So, equally, it shouldn't matter if we stop drawing such definite
boundaries around those words.
Post by Dr A. N. Walker
Post by Robert de Vincy
And this criterion would actually include "Andropov" into the élite
of Englishwordness, as I'm sure there are more people who know what
"Andropov" refers to than there are who know what, say, "skibbet"
refers to.
Yes. But once you'd explained what the words meant and what
their etymology was and so on,
Ooh, an appeal to etymology, eh? I thought we'd dealt with that earlier.
Post by Dr A. N. Walker
then I'd wager that no-one would change their minds about "Andropov",
whereas you might perhaps find that many would say "Ah yes, I never
realised there was a good old English word for the plastic things you
shove into the corners of shirt collars, I'll start to use it." And of
course there is a huge grey area of the technical words used in maths,
geography, linguistics, etc., and of regional dialect words, which may
have quite limited numbers of users but be invaluable to just those
people.
Are you saying these jargon and dialect terms *ARE* "English words" or
not?
Post by Dr A. N. Walker
[FWIW, I would like -- yes, I know it's sad, but at least it's
not (yet) a substitute for sex -- some good words for jigsaw puzzles.
As in "I'm looking for a piece with three outs, two normal but one
turned up clockwise, and an in, and that has a blue top left shoulder
and a yellow strand across the tab and that is slightly short on the
right". If everyone understood "... a blue-yellow type-seven snub
with right kurtosis", we'd get on much better.]
They might already exist, but only known to expert jigsaw puzzlers.
Maybe they publish their own dictionary?
--
BdeV
Dr A. N. Walker
2004-08-04 14:22:57 UTC
Permalink
"Pass the fromage." "Fromage" has acquired an English determiner, [...]
And then there's the pronunciation. Which English speakers would say
"Pass the fromage" with an uvular <r> in "fromage"?
Oh. Well, that's brilliant. So if some person comes up to you
and says "What's the English for fromage", you will stare at him gone
out and say "But fromage *is* English"? Only if he comes up with an
impeccable Peter Sellers accent and says "What is zee Engleesh for la
fromage" will you say "Cheese"? If we go to France on holiday, and
manage to get by with anglicised grammar and pronunciation but French
vocab, we're speaking English? But proper g/p and poor v is French?
Or what? And, equally, foreign colleagues/students/visitors over here
are still speaking Russian/German/Italian simply because their English
is not very good?

It's a POV, but it makes the whole concept meaningless.
Post by Dr A. N. Walker
*If* "bonjour" and "fromage" and others become commonly used in English,
[...]
So it's a majority vote thing?
No, that's your invention. Neither "commonly used" nor my
previous "known to many educated people" implies majority voting.
Post by Dr A. N. Walker
Post by Robert de Vincy
Which of the following are "English words"? [...]
None of them. If I, in my capacity as native Englishman with
relatively large passive vocabulary and experience of doing Ximenes,
don't recognise any of them, then they are not English.
Your criterion of what makes a word "English" keeps shifting. Here you
say it's if you recognize it.
[Actually, I typed "recognise", but no matter, except to JP.]
Yes, because if I don't recognise it, then nor does a typical educated
person. If, exceptionally, you can show that those words actually
are recognised by a reasonable community, then I'm willing to learn.
Earlier you said that it's if the word
is used by the majority.
No, you inferred that; I didn't imply it.
Unless you're the selected representative for
that majority, then the two criteria have the potential to work against
each other.
True; but I'm representative enough for this purpose.
Post by Dr A. N. Walker
If, OTOH, you can demonstrate that any competent geologist [say] quite
normally goes around saying "I must intravilate the eignes, or the
arbeful deposits of fluviolacustrine will become rhabditiform", then I
am open to persuasion.
I thought it was only if the majority of English speakers use the words.
So you did, but you were wrong. Not only about "majority",
but also about "use" -- every educated speaker of English knows and
understands the common four-letter swear words, but most of us never
use them. All of us have much larger passive than active vocabularies.
[Though Shakespeare might have made an interesting case study.]
I've never heard of "puther". Therefore... IT IS NOT AN ENGLISH WORD!
Are you an educated native speaker of English? Sufficiently
so to be adequately representative?
Post by Dr A. N. Walker
Post by Robert de Vincy
[...] How did you arrive at the decision that
"Andropov" is not an English word? Let me guess: "Heads it was, tails
it wasn't"?
No, it's a proper name of someone moderately famous but who
does not yet, and probably never will, have an associated quality
[or whatever] worthy of being an English word, [...]
And there's another definition from you: if it's a proper noun, then it
must have a distinctive quality or idea that we can associate with that
person/thing.
Not a *new* definition, just an example of the old. Unless you
know something [relevant] that I don't, if someone drops into a debate
the notion that "[whatever] has andropovian resonance" then no-one will
understand that phrase, whereas "platonic", "hoover", ... are words
that many people will understand.
Post by Dr A. N. Walker
[...]. It
doesn't *matter* that "Andropov" is not an English word, but that
"skibbet" is [if indeed it is].
So, equally, it shouldn't matter if we stop drawing such definite
boundaries around those words.
Who is drawing *definite* boundaries? There are grey areas
and border zones. That doesn't stop other areas being black or white.
As of today, "Andropov" and "whistle" are clearly on opposite sides.
That could change [either way] as the language evolves. And indeed
it doesn't *matter*, just a way of filling in the boring period when
I don't have time for research and the A-level results are not yet in.
Post by Dr A. N. Walker
Post by Robert de Vincy
And this criterion would actually include "Andropov" into the élite
of Englishwordness, as I'm sure there are more people who know what
"Andropov" refers to than there are who know what, say, "skibbet"
refers to.
Yes. But once you'd explained what the words meant and what
their etymology was and so on,
Ooh, an appeal to etymology, eh? I thought we'd dealt with that earlier.
Not exactly. The above seems to be my first use of the word
on this newsgroup, and my first on any newsgroup this century; it is
quite rare, but not unknown, on this newsgroup; you have used it four
times. Your dealing amounted to an unsupported assertion that we
can't rely on it; I bow to your superior knowledge in this area, but
don't see that "can't rely" implies "can't use at all".
Are you saying these jargon and dialect terms *ARE* "English words" or
not?
No.
--
Andy Walker, School of MathSci., Univ. of Nott'm, UK.
***@maths.nott.ac.uk
Robert de Vincy
2004-08-04 16:19:10 UTC
Permalink
Post by Dr A. N. Walker
Post by Robert de Vincy
"Pass the fromage." "Fromage" has acquired an English determiner,
[...] And then there's the pronunciation. Which English speakers
would say "Pass the fromage" with an uvular <r> in "fromage"?
Oh. Well, that's brilliant. So if some person comes up to you
and says "What's the English for fromage", you will stare at him gone
out and say "But fromage *is* English"? Only if he comes up with an
impeccable Peter Sellers accent and says "What is zee Engleesh for la
fromage" will you say "Cheese"? If we go to France on holiday, and
manage to get by with anglicised grammar and pronunciation but French
vocab, we're speaking English? But proper g/p and poor v is French?
Or what? And, equally, foreign colleagues/students/visitors over here
are still speaking Russian/German/Italian simply because their English
is not very good?
<exasperation = "slight">

[Note my point elsewhere and earlier in the thread about people not
actually getting the rhetorical point of an argument, and instead
attacking the messenger rather than the message. I'll try again, but
without any form of subtlety that might mislead...]

In the stuff you quoted and the stuff you snipped, I listed some of the
'foreign' things that the word had discarded and some of the 'English'
things that it had acquired in being used in an English sentence.

If it is *still* a 'foreign' word, then what is it that remains to set it
apart from an 'English' word? Is there something there that immediately
tells us that it's 'English'? No. I don't think there is.

This reinforces my original assertion that the idea of "an English word"
is a very vague thing.

[So much for the sharpness of a mathematically-trained mind.]

</exasperation>
Post by Dr A. N. Walker
It's a POV, but it makes the whole concept meaningless.
Exactly. Hallelujah. You've got the point.

You initially stated, without conditions, without further qualification,
that "Andropov" is not an English word. A clear-cut case of black-and-
white decision making.

My immediate (and continuing) response is that the whole idea of "it is/
isn't an English word" is vague, blurry, and can't be pinned down.


[snip the other bits, since if you can't grasp the point of my replies
then they're all just more opportunities to meander]


Okay, let's try again...
What criteria are you using to exclude "Andropov" from being "an English
word"?
If there is a systematic test for Englishwordness, you ought to be able
to describe it; you applied it to "Andropov", it seems.

In reply, I suggest that anything you list or describe will have serious
exceptions, making some very "common" words in English fail, or the process
will rely on purely subjective decision-making.
--
BdeV
Toby
2004-08-04 16:31:20 UTC
Permalink
<snip>
Post by Robert de Vincy
[So much for the sharpness of a mathematically-trained mind.]
Ouch!
Dr A. N. Walker
2004-08-04 17:23:09 UTC
Permalink
In article <***@130.133.1.4>,
Robert de Vincy <***@talk21.com> wrote:
[...]
Post by Robert de Vincy
This reinforces my original assertion that the idea of "an English word"
is a very vague thing.
But that's an unsupported assertion. It's no use pointing
to words that are difficult to assign; that just means there are
grey areas. Would you claim that "addition" is a very vague concept
because it's not clear what adding two glasses each half-full means?
In other cases it can be anything from very sharp [so that no-one
seriously disagrees about "2+3"] through quite sharp ["two apples
plus three apples"] through grey [dates/times, chemicals, ...] to
undefined, taking in some quite technical stuff [directions] on the
way. Most words in use are easy to determine.
Post by Robert de Vincy
Post by Dr A. N. Walker
It's a POV, but it makes the whole concept meaningless.
Exactly. Hallelujah. You've got the point.
*Your* point, but you're wrong.
Post by Robert de Vincy
You initially stated, without conditions, without further qualification,
that "Andropov" is not an English word. A clear-cut case of black-and-
white decision making.
My immediate (and continuing) response is that the whole idea of "it is/
isn't an English word" is vague, blurry, and can't be pinned down.
No, not the *whole* idea. *Some* words are difficult to pin
down, esp while they are making the journey one way or the other.
"Perestroika" used to be clearly not-English, is currently sufficiently
understood to be just about English, and may well drop out of use and
soon cease to be English. If your favourite "skibbet" was ever English,
it's at best marginal today, unless you know more than you have let on
and tell me that everyone in Dorset asks for it in the pub. "Andropov"
is currently not English because no-one, not a single solitary person,
knows what it *means* [though a fair number of people know who the most
famous person with that name was]. If that changes, then it can apply
for naturalisation.
Post by Robert de Vincy
Okay, let's try again...
What criteria are you using to exclude "Andropov" from being "an English
word"?
If there is a systematic test for Englishwordness, you ought to be able
to describe it; you applied it to "Andropov", it seems.
See above. I firstly expect many educated people to be able
to say what it means [as opposed to what it labels], and secondly [to
forestall a silly quibble] that those people do not usually preface
the meaning with "It's a French word that means ..." or similar.
Post by Robert de Vincy
In reply, I suggest that anything you list or describe will have serious
exceptions, making some very "common" words in English fail, or the process
will rely on purely subjective decision-making.
Common failures: surely not, though there will be some grey
areas as manifestly foreign words become assimilated. We could also
argue about whether English place names should be systematically
included ["everyone knows what London means"] or excluded ["Wollaton
means nothing at all to anyone outside Nottingham, it is no more than
a label"] or included/excluded/greyed according to the degree of
resonance ["Liverpool is more than just the place, it conveys also
a style of speech, football, music, etc"]. I can't pretend to care.

Subjective: Yes, why not? Many adjectives are subjective.
A concept like "blue" or "fast" is just as blurry as "English". Some
colours/animals manifestly *are* blue/fast, some manifestly are not,
some are matters of debate. That does not make the concept useless.
--
Andy Walker, School of MathSci., Univ. of Nott'm, UK.
***@maths.nott.ac.uk
David Haardt
2004-08-02 22:45:47 UTC
Permalink
Post by Dr A. N. Walker
Post by Robert de Vincy
Post by Dr A. N. Walker
and the more sensible question is not about its spelling but about its
correct or otherwise transliteration from the Cyrillic. For "Andropov",
this is [AFAIK] scarcely an issue; for "Khruschev", "Tschaikowsky" or
"Niemtsowitsch" it surely is.
Y-e-s... and would it matter if you wrote "Tchaikovski" instead?
In many cases, yes. For example, I have some examples of
Russian sheet music; "Tschaikowsky" sounds more like a sneeze than
the transliteration [according to *my* knowledge of Russian] of the
Cyrillic version of PIT's name, so if he had been less famous, I
might never have spotted that "my" composer and "yours" were the
same person. If you're trying to search a computerised database,
then Affabeck Lauder would never connect "Khru..." and "Xru..." or
"Tschai..." and "Chai"; and the various spellings of "Nimzovich"
are a nightmare for people trying to eliminate duplicates from their
megabases of chess games. You may also be aware that the different
spellings of the name of the Libyan leader are a well known test
case for regular expression analysers.
Well, it depends on your view of what a transliteration should be.

If it should merely be a phonetic representation of the
Cyrillic/Russian original, both "Tch..." and "Ch..." would be
perfectly valid in English.

If it should be a convention-based transliteration, then you will find
that most authorities/countries have guidelines for "correct" (or
better, "socially agreed upon") transliteration.

Still, even among native speakers of English (and English-speaking
linguists, for that matter), there are many competing transliterations
of Russian, all with their respective pros and cons (the same is true
of German, with the possible options including Tschaikowski and
Tschaikowsky). Unless a search facility explicitly takes this into
account, one needs to be aware of that.

If your database included records from countries where different
languages are spoken, the problem would become even bigger.

The same problem of course also exists the other way round, since the
Cyrillic alphabet doesn't include the "h". In the past, the accepted
transliteration was "g" (so that my family name would have been
written and pronounced "Gaardt", and Greek helios ended up being
gelios) while today it usually is kh/x (leading to a Khaardt/Xaardt or
khelios/xelios, resp., representation and pronunciation).

David Haardt
Ray Pang
2004-07-30 17:09:31 UTC
Permalink
Post by Robert de Vincy
Post by Robert de Vincy
[...]
Post by Adam
But, and this I suppose is my overall point about maths, it teaches you
how to approach problems, break them down into smaller things which you
*can* do, and the problem becomes easier.
It does, I agree. In theory. But take a look at Andy Walker's "we don't
interview for Notts maths now" anecdote about giving test questions to
potential maths students. If those students had truly absorbed all the
goodness of mathematical (i.e. logical) thinking then the "We haven't done
this in class" response should have been the exception.
I'm not arguing against your essential point here, but looking at the
evidence and results of those people who *HAVE* done "more maths".
(Okay,
so it's anecdotal and hardly a controlled study, but it's a prominent
enough
Post by Robert de Vincy
behaviour in the right sort of sample of people that it is worthy of being
mentioned.)
I believe (with no actual reason whatsoever) that they answered that more
because they didn't know fully what they were capable of.
If the next line in the interview had been "well try and work it out, let me
see your thought process" then I think that is where strengths in maths
would really be shown.
I think that is precisely what was the next line, IIRC.
Post by Robert de Vincy
I think that maths helps to teach people to learn. Even when dealing with
people, you need to be able to spot patterns in behaviour, extrapolate from
past events etc. These things are learnt through experience, but I think
mathematical training will make it easier to learn these things. Perhaps
not obviously, and there would be no clear link (people won't start thinking
"Mr X did Y last week under these circumstances, so he will do Z tomorrow")
but, I guess it's just a gut feeling I have.
The abstraction you've used there (Mr X did Y...) is the abstraction you
learn when you're about 11 or 12. "Mr X has five apples" boils down to x=5.
Post by Robert de Vincy
And I could see the link between the work I do in maths of stating a
proposition, then working through a proof, covering every angle of attack
which might make the proof invalid, and finishing with something which shows
beyond any possible doubt that the proposition is correct.
A proof is its own counterargument. In something like philosophy, for
example, you make your claims, you then cover counterarguments separately.
But you don't know that you've covered them all, and it might not be
possible to cover them all. That's the difference.
Ray Pang
2004-07-30 15:16:53 UTC
Permalink
Post by Robert de Vincy
[...]
Post by Adam
But, and this I suppose is my overall point about maths, it teaches you
how to approach problems, break them down into smaller things which you
*can* do, and the problem becomes easier.
It does, I agree. In theory. But take a look at Andy Walker's "we don't
interview for Notts maths now" anecdote about giving test questions to
potential maths students. If those students had truly absorbed all the
goodness of mathematical (i.e. logical) thinking then the "We haven't done
this in class" response should have been the exception.
Which means that only the good students absorbed the goodness of
mathematical thinking. Which is what he was after.
Post by Robert de Vincy
Post by Adam
Post by Robert de Vincy
Learning a whole load of mathematical proofs and formulas and whatnots
will not help you use a word-processor, will not show you how to create
a safe and recoverable back-up routine, and will certainly not tell you
if formatting an infected floppy will remove the virus.
I'm gonna latch on to the "safe and recoverable back-up routine" cos I
think that being good at maths and being good at devising such routines
*are* linked (tho, of course, there will be exceptions ... but what I'm
saying is it's easier to teach people to think via maths, than via
studying classic literature and devising arguments and writing essays,
even tho both do the same sortof thing).
Yes, I don't doubt that being taught maths to A-level (and higher) standard
means you should be learning how to think about problems in a methodical
way.
1.
Is it really happening in the Real World? I refer you, again, to Mr Walker's
potential uni students. They, presumably, were very proficient at A-level-
standard mathematics and were clearly eager to continue with their exploration
of mathematics, but did the majority exhibit the sort of logical and
systematic thinking that maths *ought to* induce? From the evidence of the
anecdote, it seems not.
Then they weren't eager to continue their exploration. They thought they
might have been, but they weren't. The screening process picked out those
that didn't really know. The eager ones would've used the interview as a
chance to learn.
Post by Robert de Vincy
a) the idea that maths teaches logical thinking is false;
Maths does teach logical thinking. Whether you pick it up or not is a
different matter.
Post by Robert de Vincy
b) the students in the example were a particularly bad sample and do not
exemplify the typical "mathematically trained" student;
I guess Andy wanted the best students, and only a few people are able to be
the best.
Post by Robert de Vincy
or
c) for some people, this "training" does stay with them and make a difference
(and these are the ones that Mr Walker described as managing to solve the
problems), but for the majority, this "training" is lost on them.
I think that is the most accurate suggestion.
Post by Robert de Vincy
There are probably other explanations, but these are the ones that I think
are most likely, given no further details or data.
So, the point of this point is: the techniques taught in mathematics should
lead to logical and methodical thinking, but can we see any convincing
evidence of this, except as minority exceptions?
That is the criticism that mathematical farts like myself like to level at
GCSE and A-level maths exams. They are/were too much like the past papers.
You could practise the past papers and generally know what's coming up.
Post by Robert de Vincy
2.
People don't always behave in mathematically logical (i.e. the definition of
"logical" that Ray Pang used elsewhere in this thread) ways. People behave
in the way that people behave, not in mechanically predictable behaviour
patterns. I've hinted at this... frustration?... before on AUA, and I'll
explicitly say it now. Studying computers and maths and logical problem-
solving is fine and dandy for some areas, but it does not equip you with
the right approach to dealing with people. You need something else...
empathy, understanding, call-it-what-you-will. No amount of mathematical
logic will enable you to understand how a person truly interacts with a
computer.
Indeed. That's why there are so many lousy UI designs out there. Programs
written by computer scientists/mathematicians, without any user testing, for
example, exhibit this.
Post by Robert de Vincy
There has to be that extra "Well, this is what humans think
and want and feel about such-and-such" so that, for example, a workable
back-up routine will, um, work.
Sure, the underlying process might be produced through methodical application
of logical thinking, but that's not the end of it. It's missing the
vital component, the essential component, the component whose priorities and
needs matter more than anything else: the person using this tool that you
have created. Mathematically logical thinking does not teach you how to cope
with that part.
Yes. You're right. People who moan that the "Shut Down" button is the
nearest thing to the "Start" button in Windows are an example of the sort of
thing you're describing. OK, it might not make complete logical sense, but
Microsoft do a hell of a lot of user testing, and their testing defies this
logic.
Post by Robert de Vincy
3.
Schools should teach the art of rhetoric. It's amazing how many weak and
feeble arguments appear on Usenet. And -- even more annoyingly -- how many
times people will reply to an argument mid-paragraph (or mid-sentence, even!)
before the actual point has been made, replying directly to the rhetorical
device rather than the real point that is being set-up to be made. I guess
that's the danger inherent in this medium with the ability to "interrupt" a
person's long essay at any point you choose, but I do think the inability to
think about what the person is saying as a whole is missing from so many
people.
Maths might teach logical reasoning, but a little bit of rhetorical training
will make one's arguing technique much more effective.
It does. If there's something that seems like nonsense, you don't just snap
and pick it out there. You think "hang on, let me follow this through, and
maybe it'll turn out that it was inspired, or maybe it will highlight that
it's nonsense." Maths isn't as cold and blunt as some people might expect.
It does involve a heck of a lot of craft to be convincing.
Post by Robert de Vincy
There is a whole world of A-level topics out there, and it puzzles me why,
time and again, maths seems to appear in this group and other subjects don't.
There are more maths/comp sci's on here than other subjects, I believe.
Robert de Vincy
2004-07-30 17:45:11 UTC
Permalink
Ray Pang did write:

[...]
Post by Ray Pang
Post by Robert de Vincy
Maths might teach logical reasoning, but a little bit of rhetorical training
will make one's arguing technique much more effective.
It does. If there's something that seems like nonsense, you don't just
snap and pick it out there. You think "hang on, let me follow this
through, and maybe it'll turn out that it was inspired, or maybe it
will highlight that it's nonsense." Maths isn't as cold and blunt as
some people might expect. It does involve a heck of a lot of craft to
be convincing.
Doesn't that contradict what Adam said earlier?

He claimed that at every step along the way (in a mathematical argument),
you can stop and check if what you've done is right/wrong.
But to build an argument that relies on appealing to a person's emotions
and feelings, you can (if you want -- it's optional and dependent on the
approach you take) be contradicting your point (or the "truth") right
until the very end where you turn it all around and make a grand impact
with an eloquent piece of oratory.
Would you ever use something like:
"x = 5
y = 5x = 20
Therefore, xy = 100"?
At any point where there's an error, your sequence will be brought to a
halt and pointed out as being erroneous, won't it? (It's how the old
"1 = 2" piece of prestidigitation confuses some people.)

But in a non-mathematical argument, you could start off by saying "War is
peace" and progress until you've made a point that shows this can be true
(or false) even though you've made a blatant contradiction in your very
opening sentence.
Post by Ray Pang
Post by Robert de Vincy
There is a whole world of A-level topics out there, and it puzzles me
why, time and again, maths seems to appear in this group and other
subjects don't.
There are more maths/comp sci's on here than other subjects, I believe.
Yes, but why are there more maths/compsci people here and fewer Other
Subjecters?
Easy!
Because there are more maths-related posts here that attract them and make
them feel like they can contribute.
Okay, but why are there more maths posts here and fewer Other subject posts?
Easy!
Because there are more maths/compsci people here and fewer Other Subjecters.
Let's go round again (under-5s go free)...
--
BdeV
Ray Pang
2004-07-30 17:57:36 UTC
Permalink
Post by Robert de Vincy
[...]
Post by Ray Pang
Post by Robert de Vincy
Maths might teach logical reasoning, but a little bit of rhetorical training
will make one's arguing technique much more effective.
It does. If there's something that seems like nonsense, you don't just
snap and pick it out there. You think "hang on, let me follow this
through, and maybe it'll turn out that it was inspired, or maybe it
will highlight that it's nonsense." Maths isn't as cold and blunt as
some people might expect. It does involve a heck of a lot of craft to
be convincing.
Doesn't that contradict what Adam said earlier?
He claimed that at every step along the way (in a mathematical argument),
you can stop and check if what you've done is right/wrong.
But to build an argument that relies on appealing to a person's emotions
and feelings, you can (if you want -- it's optional and dependent on the
approach you take) be contradicting your point (or the "truth") right
until the very end where you turn it all around and make a grand impact
with an eloquent piece of oratory.
"x = 5
y = 5x = 20
Therefore, xy = 100"?
Presumably you mean "y=5, x=20".
Post by Robert de Vincy
At any point where there's an error, your sequence will be brought to a
halt and pointed out as being erroneous, won't it? (It's how the old
"1 = 2" piece of prestidigitation confuses some people.)
But in a non-mathematical argument, you could start off by saying "War is
peace" and progress until you've made a point that shows this can be true
(or false) even though you've made a blatant contradiction in your very
opening sentence.
I presume you're unfamiliar with proof by contradiction, in which the tactic
is exactly as you describe. Try and follow this:

I want to prove that if x=5 and y=20 then xy=100.

Suppose not. That is, assume x=5 and y=20 but xy!=100 under some
circumstances (!= is 'not equal').

Regardless of any other circumstances, xy=5*20=100, which contradicts the
assumption, hence the assumption that x=5, y=20 but xy!=100 under some
circumstances is false.

Akin to "War is peace". Let x=war, y=peace. USA implies war, but USA does
not imply peace. Hence x cannot be equal to y. And thus war cannot be peace.
Post by Robert de Vincy
Post by Ray Pang
Post by Robert de Vincy
There is a whole world of A-level topics out there, and it puzzles me
why, time and again, maths seems to appear in this group and other
subjects don't.
There are more maths/comp sci's on here than other subjects, I believe.
Yes, but why are there more maths/compsci people here and fewer Other
Subjecters?
Easy!
Because there are more maths-related posts here that attract them and make
them feel like they can contribute.
Or, as mentioned before, mathsy types generally have a bit more computer
savvy and are more likely to discover usenet and tell their friends, many of
whom are likely to be mathsy types.
Robert de Vincy
2004-07-30 18:58:29 UTC
Permalink
Post by Ray Pang
Post by Robert de Vincy
Doesn't that contradict what Adam said earlier?
He claimed that at every step along the way (in a mathematical
argument), you can stop and check if what you've done is right/wrong.
But to build an argument that relies on appealing to a person's
emotions and feelings, you can (if you want -- it's optional and
dependent on the approach you take) be contradicting your point (or
the "truth") right until the very end where you turn it all around
and make a grand impact with an eloquent piece of oratory.
"x = 5
y = 5x = 20
Therefore, xy = 100"?
Presumably you mean "y=5, x=20".
Um, no.

I'm saying:
1. Let x = 5
2. Let y = 5x
3. This means y = 20 <==== HERE'S THE MISTAKE/ERROR/FALSEHOOD!
4. Therefore, xy = 100

Now, is that a perfectly valid sequence? Would you accept my "5x = 20"
implication in #4 and use it for any further calculation from that point?
Post by Ray Pang
Post by Robert de Vincy
At any point where there's an error, your sequence will be brought to
a halt and pointed out as being erroneous, won't it? (It's how the
old "1 = 2" piece of prestidigitation confuses some people.)
But in a non-mathematical argument, you could start off by saying
"War is peace" and progress until you've made a point that shows this
can be true (or false) even though you've made a blatant
contradiction in your very opening sentence.
I presume you're unfamiliar with proof by contradiction, in which the
I want to prove that if x=5 and y=20 then xy=100.
Suppose not. That is, assume x=5 and y=20 but xy!=100 under some
circumstances (!= is 'not equal').
Regardless of any other circumstances, xy=5*20=100, which contradicts the
assumption, hence the assumption that x=5, y=20 but xy!=100 under some
circumstances is false.
Akin to "War is peace". Let x=war, y=peace. USA implies war, but USA
does not imply peace. Hence x cannot be equal to y. And thus war
cannot be peace.
I see. But why use that "proof by contradiction" trickery, and what
benefits does it have over "Well, since x=5 and y=20, and 5*20=100, then
xy=100"?
Does it offer the same advantage that a well-crafted piece of oration that
whips up a person's emotions in order to drive a point more forcefully has?
When would it be useful or advantageous or (maybe) the only method? Is it
not just a fancy way of going about the same thing with no real benefits/
difference at the end? Would it turn a sceptic into a believer much more
easily than my "Well, since x=5..." sentence above?
Post by Ray Pang
Post by Robert de Vincy
Post by Ray Pang
Post by Robert de Vincy
There is a whole world of A-level topics out there, and it puzzles
me why, time and again, maths seems to appear in this group and
other subjects don't.
There are more maths/comp sci's on here than other subjects, I believe.
Yes, but why are there more maths/compsci people here and fewer Other
Subjecters?
Easy!
Because there are more maths-related posts here that attract them and
make them feel like they can contribute.
Or, as mentioned before, mathsy types generally have a bit more
computer savvy and are more likely to discover usenet and tell their
friends, many of whom are likely to be mathsy types.
Yes, that's a better answer, and I didn't even have to ask the question
again!
--
BdeV
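Adam's earlier point, that a mathematical argument can be checked at every step, can be made concrete. The sketch below is mine, not from the thread: it replays the faulty sequence above with one check per step, so the argument halts at the first false step rather than carrying the error through to "xy = 100".

```python
x = 5              # 1. Let x = 5
y = 5 * x          # 2. Let y = 5x  (so y is actually 25)

# Each numbered step of the argument becomes a checkable claim.
steps = [
    (y == 20, "step 3: y = 20"),         # the flagged mistake
    (x * y == 100, "step 4: xy = 100"),  # only follows if step 3 held
]
for holds, claim in steps:
    if not holds:
        print(f"sequence halts at the first false step: {claim}")
        break
# prints: sequence halts at the first false step: step 3: y = 20
```

The non-mathematical "War is peace" argument has no analogue of this loop: there is no mechanical check to run on each sentence, which is exactly the contrast being drawn.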
Ray Pang
2004-07-30 22:31:24 UTC
Permalink
Post by Robert de Vincy
Post by Ray Pang
Post by Robert de Vincy
Doesn't that contradict what Adam said earlier?
He claimed that at every step along the way (in a mathematical
argument), you can stop and check if what you've done is right/wrong.
But to build an argument that relies on appealing to a person's
emotions and feelings, you can (if you want -- it's optional and
dependent on the approach you take) be contradicting your point (or
the "truth") right until the very end where you turn it all around
and make a grand impact with an eloquent piece of oratory.
"x = 5
y = 5x = 20
Therefore, xy = 100"?
Presumably you mean "y=5, x=20".
Um, no.
1. Let x = 5
2. Let y = 5x
3. This means y = 20 <==== HERE'S THE MISTAKE/ERROR/FALSEHOOD!
4. Therefore, xy = 100
Now, is that a perfectly valid sequence? Would you accept my "5x = 20"
implication in #4 and use it for any further calculation from that point?
Of course not. Don't quite get what you're getting at.
Post by Robert de Vincy
Post by Ray Pang
Post by Robert de Vincy
At any point where there's an error, your sequence will be brought to
a halt and pointed out as being erroneous, won't it? (It's how the
old "1 = 2" piece of prestidigitation confuses some people.)
But in a non-mathematical argument, you could start off by saying
"War is peace" and progress until you've made a point that shows this
can be true (or false) even though you've made a blatant
contradiction in your very opening sentence.
I presume you're unfamiliar with proof by contradiction, in which the
I want to prove that if x=5 and y=20 then xy=100.
Suppose not. That is, assume x=5 and y=20 but xy!=100 under some
circumstances (!= is 'not equal').
Regardless of any other circumstances, xy=5*20=100, which contradicts the
assumption, hence the assumption that x=5, y=20 but xy!=100 under some
circumstances is false.
Akin to "War is peace". Let x=war, y=peace. USA implies war, but USA
does not imply peace. Hence x cannot be equal to y. And thus war
cannot be peace.
I see. But why use that "proof by contradiction" trickery, and what
benefits does it have over "Well, since x=5 and y=20, and 5*20=100, then
xy=100"?
Does it offer the same advantage that a well-crafted piece of oration that
whips up a person's emotions in order to drive a point more forcefully has?
When would it be useful or advantageous or (maybe) the only method? Is it
not just a fancy way of going about the same thing with no real benefits/
difference at the end? Would it turn a sceptic into a believer much more
easily than my "Well, since x=5..." sentence above?
Well in this case it is pretty stupid, but still valid. There are many cases
where it makes total and utter sense. The classic proof that sqrt(2) is
irrational is a proof by contradiction. You assume it's not irrational, i.e.
it's rational, follow the consequences of it being rational and then you get
a situation where sqrt(2) no longer follows the rules of being rational, so
it couldn't have been rational in the first place. Hence it is irrational. A
direct proof that sqrt(2) is irrational is harder (I think - I don't
actually know of one, and can't really be bothered to try and create one, as
contradiction is easier).

Believe me, proof by contradiction is immensely convincing and often a great
way of trying to prove something. It's handy when it's easier to disprove
the converse of what you're trying to prove, than to prove what you're
trying to prove.

Let's try a real world example.

Suppose I want to prove that not all cars have precisely four wheels.

Proof by contradiction goes along the following lines: Assume that all cars
have four wheels. But we know for a fact that Reliant Robins, which are
cars, have precisely three wheels. Which contradicts the assumption. Hence
the only valid conclusion from that is that the assumption is false, and
thus not all cars have four wheels. End of proof.

Direct proof would be to refer to the definition of car. A car is defined as
a motorised device that people drive. Hmm, doesn't get us far. Doesn't
dismiss the possibility that there are no cars without four wheels, so we
have to look elsewhere. In fact, I don't know where I'd look. See how
difficult direct proof can be? See how easy proof by contradiction made it?
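For reference, the classic contradiction argument for sqrt(2) mentioned above runs as follows (the standard proof, sketched here in outline):

```latex
% Assume, for contradiction, that \sqrt{2} is rational:
\sqrt{2} = \tfrac{p}{q}, \qquad \gcd(p, q) = 1
\;\Rightarrow\; 2q^2 = p^2
\;\Rightarrow\; p \text{ is even, say } p = 2k
\;\Rightarrow\; 2q^2 = 4k^2
\;\Rightarrow\; q^2 = 2k^2
\;\Rightarrow\; q \text{ is even.}
% So 2 divides both p and q, contradicting \gcd(p, q) = 1.
% Hence \sqrt{2} is irrational.
```

Note how the assumption "sqrt(2) is rational" is held, correctly manipulated, and only discharged at the end, which is precisely the "state the opposite, then demolish it" structure being discussed.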
Ray Pang
2004-07-30 22:41:12 UTC
Permalink
Post by Ray Pang
Post by Robert de Vincy
Post by Ray Pang
Post by Robert de Vincy
Doesn't that contradict what Adam said earlier?
He claimed that at every step along the way (in a mathematical
argument), you can stop and check if what you've done is right/wrong.
But to build an argument that relies on appealing to a person's
emotions and feelings, you can (if you want -- it's optional and
dependent on the approach you take) be contradicting your point (or
the "truth") right until the very end where you turn it all around
and make a grand impact with an eloquent piece of oratory.
"x = 5
y = 5x = 20
Therefore, xy = 100"?
Presumably you mean "y=5, x=20".
Um, no.
1. Let x = 5
2. Let y = 5x
3. This means y = 20 <==== HERE'S THE MISTAKE/ERROR/FALSEHOOD!
4. Therefore, xy = 100
Now, is that a perfectly valid sequence? Would you accept my "5x = 20"
implication in #4 and use it for any further calculation from that point?
Of course not. Don't quite get what you're getting at.
Post by Robert de Vincy
Post by Ray Pang
Post by Robert de Vincy
At any point where there's an error, your sequence will be brought to
a halt and pointed out as being erroneous, won't it? (It's how the
old "1 = 2" piece of prestidigitation confuses some people.)
But in a non-mathematical argument, you could start off by saying
"War is peace" and progress until you've made a point that shows this
can be true (or false) even though you've made a blatant
contradiction in your very opening sentence.
I presume you're unfamiliar with proof by contradiction, in which the
I want to prove that if x=5 and y=20 then xy=100.
Suppose not. That is, assume x=5 and y=20 but xy!=100 under some
circumstances (!= is 'not equal').
Regardless of any other circumstances, xy=5*20=100, which contradicts the
assumption, hence the assumption that x=5, y=20 but xy!=100 under some
circumstances is false.
Akin to "War is peace". Let x=war, y=peace. USA implies war, but USA
does not imply peace. Hence x cannot be equal to y. And thus war
cannot be peace.
I see. But why use that "proof by contradiction" trickery, and what
benefits does it have over "Well, since x=5 and y=20, and 5*20=100, then
xy=100"?
Does it offer the same advantage that a well-crafted piece of oration that
whips up a person's emotions in order to drive a point more forcefully has?
When would it be useful or advantageous or (maybe) the only method? Is it
not just a fancy way of going about the same thing with no real benefits/
difference at the end? Would it turn a sceptic into a believer much more
easily than my "Well, since x=5..." sentence above?
Well in this case it is pretty stupid, but still valid. There are many
cases where it makes total and utter sense. The classic proof that sqrt(2)
is irrational is a proof by contradiction. You assume it's not irrational,
i.e. it's rational, follow the consequences of it being rational and then
you get a situation where sqrt(2) no longer follows the rules of being
rational, so it couldn't have been rational in the first place. Hence it
is irrational. A direct proof that sqrt(2) is irrational is harder (I
think - I don't actually know of one, and can't really be bothered to try
and create one, as contradiction is easier).
Believe me, proof by contradiction is immensely convincing and often a
great way of trying to prove something. It's handy when it's easier to
disprove the converse of what you're trying to prove, than to prove what
you're trying to prove.
Let's try a real world example.
Suppose I want to prove that not all cars have precisely four wheels.
Proof by contradiction goes along the following lines: Assume that all
cars have four wheels. But we know for a fact that Reliant Robins, which
are cars, have precisely three wheels. Which contradicts the assumption.
Hence the only valid conclusion from that is that the assumption is false,
and thus not all cars have four wheels. End of proof.
Direct proof would be to refer to the definition of car. A car is defined
as a motorised device that people drive. Hmm, doesn't get us far. Doesn't
dismiss the possibility that there are no cars without four wheels, so we
have to look elsewhere. In fact, I don't know where I'd look. See how
difficult direct proof can be? See how easy proof by contradiction made it?
In fact, this is precisely what I meant earlier when I said that there's a
lot of craft involved with maths. It's not purely mechanical (er, no pun
intended) - there's spark and genius to quite a lot of proof. When I was
studying maths (god I feel so old now), and came across a proof like this,
I'd think, well why did he/she use contradiction/induction/whatever? But
when you understand why the technique that was used WAS actually used, you
spot the ingenuity and brilliance of it. Sure, there may be alternatives
that reach the same conclusion, but if they're a lot harder to follow, then
it's like an epic story told badly.
Robert de Vincy
2004-07-30 23:17:39 UTC
Permalink
Ray Pang did write:

[...]
Post by Ray Pang
Well in this case it is pretty stupid, but still valid. There are many
cases where it makes total and utter sense. The classic proof that
sqrt(2) is irrational is a proof by contradiction. You assume it's not
irrational, i.e. it's rational, follow the consequences of it being
rational and then you get a situation where sqrt(2) no longer follows
the rules of being rational, so it couldn't have been rational in the
first place. Hence it is irrational. A direct proof that sqrt(2) is
irrational is harder (I think - I don't actually know of one, and
can't really be bothered to try and create one, as contradiction is
easier).
Believe me, proof by contradiction is immensely convincing and often a
great way of trying to prove something. It's handy when it's easier to
disprove the converse of what you're trying to prove, than to prove
what you're trying to prove.
Let's try a real world example.
Suppose I want to prove that not all cars have precisely four wheels.
Proof by contradiction goes along the following lines: Assume that
all cars have four wheels. But we know for a fact that Reliant Robins,
which are cars, have precisely three wheels. Which contradicts the
assumption. Hence the only valid conclusion from that is that the
assumption is false, and thus not all cars have four wheels. End of
proof.
Direct proof would be to refer to the definition of car. A car is
defined as a motorised device that people drive. Hmm, doesn't get us
far. Doesn't dismiss the possibility that there are no cars without
four wheels, so we have to look elsewhere. In fact, I don't know where
I'd look. See how difficult direct proof can be? See how easy proof by
contradiction made it?
Okay, I've read that a couple of hundred times (it seems!) and I admit
that I was thrown by the initial "spark" for this particular subthread,
namely Adam's earlier paragraph:
=======================================================================
Post by Ray Pang
My reason is, when doing maths, you can't 'fudge' things ever, you can't say
"oh, well, it's nearly right", you have to make sure that everything you
have done, every step you take, is accurate.
=======================================================================
The word "accurate" clashed with your "assumption" and I got bogged down
in a tangle of cognitive dissonance.

At least I learned something unexpected today!
--
BdeV
Ray Pang
2004-07-30 15:01:45 UTC
Permalink
Post by Adam
Post by Robert de Vincy
Post by Adam
Post by Robert de Vincy
Post by unknown
Do more maths instead :-)
Yep. That's useful for the "real world".
Actually it is.
A familiarity with numbers is essential to modern life,
I would disagree with that. With the implication, anyway, that in our
"modern life" we need familiarity with numbers more and more. Isn't
the hue and cry about innumeracy all about how our "modern life" gives
us computers and calculators and machines that place an extra level
between the common chap-in-the-street and a need to work with actual
numbers?
In other words, we -- now -- have less need to know how to add up or
work out a square-root to a practical precision or whatever because
"modern life" provides devices to do that stuff.
Or if you're talking about a familiarity with numbers that goes beyond
practical arithmetic, then... seriously? Who needs maths to that sort
of level enough to call its acquisition "essential to modern life"?
Okay, maybe 'essential' was a bit strong, I am very slow at adding up, even
worse at subtracting, if someone asked me "seven plus six" i'd have to do
"six plus six plus one" in my head.
That is how I do that sort of thing. 135 - 28. I'd do 135, 105, 107. 135+28.
I'd do 135, 165, 163. It just happens quite quickly.
Post by Adam
But, and this I suppose is my overall point about maths, it teaches you how
to approach problems, break them down into smaller things which you *can*
do, and the problem becomes easier.
Absolutely 100% spot on. One of my maths lecturers was always despondent
when he asked questions in lectures to try and get audience participation.
He always always told us to break the problem down to a simpler case and
work upwards.
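The round-then-correct trick Ray describes ("135 - 28: 135, 105, 107") can be written out as a two-step procedure; this sketch and the function name are my own invention, shown only to make the decomposition explicit.

```python
def mental_subtract(a: int, b: int) -> tuple[int, int]:
    """Subtract b by overshooting to the next multiple of ten, then
    correcting -- mirroring '135 - 28: 135, 105, 107' above.
    Returns (intermediate, result) so both mental steps are visible."""
    rounded = -(-b // 10) * 10            # round b up to a multiple of ten
    intermediate = a - rounded            # 135 - 30 = 105
    return intermediate, intermediate + (rounded - b)  # 105 + 2 = 107
```

The same "break it into smaller things you *can* do" idea scales from mental arithmetic up to the lecture-hall advice of reducing a problem to a simpler case and working upwards.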
Samsonknight
2004-07-29 16:51:04 UTC
Permalink
Post by Robert de Vincy
Post by unknown
Post by Nigel Dooler
I have just read two posts on uk.education.schools-it by IT teachers
asking some amazingly simple IT related questions. Makes you think if
they are asking such questions if they should really be teaching such
a subject. I find these people asking such basic questions quite
shocking.
It's actually quite normal.
IT in schools is a load of toss (i.e. hardly anything useful for the real
world).
Do more maths instead :-)
Yep. That's useful for the "real world".
--
BdeV
I just did 2 years of that subject as part of my A-levels and it was such a
waste of time. SCRAP IT!
John Porcella
2004-07-30 22:59:37 UTC
Permalink
Okay, Nigel, then tell us, does formatting a floppy disk remove viruses on
every single occasion?
--
MESSAGE ENDS.
John Porcella
Post by Nigel Dooler
I have just read two posts on uk.education.schools-it by IT teachers
asking some amazingly simple IT related questions. Makes you think if
they are asking such questions if they should really be teaching such
a subject. I find these people asking such basic questions quite
shocking.
On Sat, 17 Jul 2004 19:19:42 +0100, "Liz Jordan"
If a floppy disk is formatted does this remove viruses?
Thanks
Liz
How do I change the keyboard to a US one to suit some software? It's
XP, btw.
Chris Korhonen
2004-08-02 15:24:26 UTC
Permalink
You'd be surprised how many academics who are able to write complex Java
client-server applications in their sleep are unable to maintain a
Windows XP install or use applications such as PowerPoint.