You can teach middle school children how to define complex numbers, given real numbers as a starting point. You can't necessarily even teach college students or adults how to define real numbers, given rational numbers as a starting point.
well, it's hard to formally define them, but it's not hard to say "imagine that all these decimals go on forever" and not worry about the technicalities.
An infinite decimal expansion isn't enough. It has to be an infinite expansion that does not contain a repeating pattern. Naively, this would require an infinite amount of information to specify a single real number in that manner, and so it's not obvious that this is a meaningful or well-founded concept at all.
I don't quite get what you mean here. While you need to allow infinite expansions without repeating patterns, you also need expansions with these patterns to get all the reals. Maybe the most difficult part is to explain why 0.(9) and 1 should be the same, though, while no such identification happens for repeating patterns other than (9).
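One way to make the 0.(9) = 1 identification concrete is to compute, with exact rational arithmetic, how far the truncations 0.9, 0.99, 0.999, ... sit from 1. A sketch in Python (the helper name `nines` is mine):

```python
from fractions import Fraction

def nines(k):
    """The rational number 0.99...9 with k nines, computed exactly."""
    return Fraction(10**k - 1, 10**k)

# The gap to 1 is exactly 10^-k, which drops below any positive tolerance.
# That is the precise sense in which the expansions 0.(9) and 1.(0)
# name the same real number.
gaps = [Fraction(1) - nines(k) for k in range(1, 6)]
print(gaps)  # [Fraction(1, 10), Fraction(1, 100), Fraction(1, 1000), ...]
```

The point being that the identification isn't an arbitrary convention: any definition of "value of an expansion" as a limit of truncations forces it.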
Imagine you have a ruler. You want to cut it exactly at the 10 cm mark.
Maybe you were able to cut at 10.000, but if you go more precise you'll start seeing other digits, and they will not be repeating. You just picked a real number.
Also, my intuition for why almost all numbers are irrational: if you break a ruler at any random part, and then measure it, the probability is zero that as you look at the decimal digits they are all zero or have a repeating pattern. They will basically be random digits.
> Maybe you were able to cut at 10.000, but if you go more precise you'll start seeing other digits, and they will not be repeating. You just picked a real number.
A reasonably defensible inference would be that adding a finite amount of precision adds a finite number of additional digits. That is a physically realizable operation. There's no obvious physical meaning to the idea of repeating that operation infinitely many times, so this is not clearly a meaningful way of defining or constructing real numbers. If you were trying to use this construction to convince a skeptic that irrational real numbers exist, you would fail -- they would simply retort that arbitrary finite precision exists and that you have failed to demonstrate infinite non-repeating, non-terminating precision.
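The "finite precision adds finitely many digits" point can be made concrete: each refinement step is a finite, physically realizable computation, e.g. extracting the first k digits of sqrt(2) with integer square roots. A Python sketch (the function name is mine):

```python
from math import isqrt

def sqrt2_truncated(k):
    """First k decimal digits of sqrt(2) after the point, as a string.

    Each call is a finite computation over integers; no call, and no
    finite sequence of calls, ever exhibits the full infinite expansion.
    """
    n = isqrt(2 * 10**(2 * k))  # floor(sqrt(2) * 10^k), exact integer math
    s = str(n)
    return s[0] + "." + s[1:]

print(sqrt2_truncated(10))  # 1.4142135623
```

Any single output here is a rational number, which is exactly the skeptic's retort: arbitrary finite precision never demonstrates the completed infinite object.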
What are you talking about? Infinite decimals give reals, do they not? Repeating decimals give rationals, which are a subset of the reals.
The colloquial phrase 'infinite decimal' is perfectly intelligible without reference to whether it's an infinite amount of data or rigorously defined or whatever else.
There's a lot of trickery involved in dealing with the reals formally, but they're still easy to conceptualize intuitively.
“What I’m talking about” is that they are not easy to conceptualize intuitively.
If I were a skeptic of real numbers, I’d tell you that talking about an infinite decimal expansion that never terminates and contains no repeating pattern is nonsense. I’d say such a thing doesn’t exist, because you can’t specify a single example by writing down its decimal expansion — by definition. So if that’s the only idea you have to convince a skeptic, you’ve already failed and are out of the game. To convince the skeptic, you’d have to develop a more sophisticated method to show indirectly an example of a real number that is not rational (for instance, perhaps by proving that, should sqrt(2) exist, it cannot be rational).
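For what it's worth, the indirect argument alluded to here is short — the standard parity proof, sketched in LaTeX:

```latex
% Suppose \sqrt{2} were rational, written in lowest terms:
\sqrt{2} = \tfrac{p}{q}, \quad \gcd(p, q) = 1
\implies 2q^2 = p^2
\implies p \text{ is even, say } p = 2r
\implies 2q^2 = 4r^2 \implies q^2 = 2r^2
\implies q \text{ is even as well,}
% contradicting \gcd(p, q) = 1, so no such p/q exists.
```

Note what the proof does and does not establish: it shows no rational squares to 2, not that a completed object called sqrt(2) exists — the skeptic's distinction above.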
I guess we are talking about different things. It seems to me that it's trivial to imagine them conceptually. They go on forever and most of them never repeat? Sounds good to me. Sqrt(2) never repeats? sure, whatever. I never found the proofs of this stuff very interesting.
Now, I am a skeptic of their use in physics / science. But that's a different question, and more about pedagogy than the raw content of the theories.
With that approach, all anyone has to say is that you'd have to provide infinite information to specify an example and that the way these objects interact is completely undefined; therefore you haven't defined or done anything at all. You are indeed simply imagining something -- and nothing more. You can imagine whatever you want, but nobody else is inclined to believe that what you imagine exists or behaves in the intended manner.
Beyond that, if a skeptic were inclined to accept the existence of objects with "infinite information content" by definition, they could then ask you to simply add two of them together. That would most likely be the end of it -- trying to add infinite non-repeating decimal expansions does not behave intuitively. To answer this type of question in general, you would have to prove that the set of all infinite decimal expansions, if we grant its existence, has a property called completeness, as you would eventually discover that you would have to define the addition x+y of these numbers as a limit: x+y = lim_{k -> infinity} (x_k + y_k), where x_k and y_k are the rational numbers obtained by truncating x and y after k digits. You must prove this limit always exists and is unique and well-defined. And even having done all that work, you still couldn't give a single example of one of these numbers without additional nontrivial work, so a skeptic could still easily reject all of this.
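The truncation-addition idea can be sketched numerically: adding the k-digit truncations of two numbers agrees with the true sum to within 2*10^-k, and that bound is exactly what you'd use to prove the limit exists. A Python sketch with exact rationals standing in for infinite expansions (helper names are mine):

```python
from fractions import Fraction

def truncate(x, k):
    """Keep the first k digits of x after the decimal point (round toward 0)."""
    scale = 10**k
    return Fraction(int(x * scale), scale)

# Exact rationals standing in for two "infinite expansions".
x = Fraction(1, 3)   # 0.333...
y = Fraction(1, 7)   # 0.142857142857...

for k in (1, 5, 10):
    approx = truncate(x, k) + truncate(y, k)
    # Each truncation discards at most 10^-k, so the total error is at
    # most 2*10^-k; this uniform bound is what makes x+y well-defined
    # as the limit of truncated sums.
    assert abs((x + y) - approx) <= Fraction(2, 10**k)
```

Of course, this only demonstrates the bound for rationals; extending it to all expansions is precisely the completeness work described above.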
This is far beyond what you could reasonably expect the typical middle school student or even general member of the adult population to follow and far more difficult than simply defining complex numbers as having the form x+iy.
yes, I am describing imagining something. Imagine taking decimals and letting them go on without ending. That is conceptualizing them intuitively. It is easy.
I don't really know what you're arguing about. You are describing the sorts of things that have to be solved to construct them rigorously. But I don't know why. No one is talking about that.
I was talking about that, specifically, the relative difficulty of defining reals from rationals vs complex numbers from reals. You replied to me. :)
Moreover, I disagree that you have imagined real numbers. I don’t think you’ve imagined a single real number at all in the manner you describe. Why should I believe you've even described anything that isn't rational to begin with? For instance, 0.999... is the same as 1. Why should I not think that whatever decimal expansion you're imagining is, similarly, equivalent to a rational number we already know about? Occam's razor would reasonably suggest you're just imagining different representations of objects already accounted for in the rationals. After all, an infinite amount of precision captured by an infinite non-repeating string of digits could easily just converge back to a number we already know.
I am very confused why you are continually talking about rationals as if they are not real. every rational number is also a real number, in the usual conception of things, is it not? Perhaps you are distinguishing the two? like regarding 1.000 as an equivalence class of Cauchy sequences is not the same as 1.000 as the equivalence class of a/a?
because when I picture 1.000 I am clearly imagining a real number. Likewise if I imagine pi, as defined any way you like.
My language was sloppy, but I'll admit I thought it was pretty obvious that we were talking about defining the rest of the reals starting from the rationals -- obvious enough that it didn't need clarification. I can't edit my prior comment, but you may imagine it has been amended in the obvious way with that clarity made explicit rather than implicit and reply to it again if you're interested in continuing the conversation.