Maybe not an invention, but the greatest "feature" of ours is our thumbs :D
Greatest invention man created throughout history?
Originally posted by subba: Try multiplying 217 x 3287 using base 6.
Try multiplying 4328 x 2138 using Roman numerals.
Try dividing 446824 by 326 using base 6 or Roman numerals.
Do the same using base 10.
Writing the tenth number as a 1 followed by a zero is mankind's BIGGEST jump in civilization. But then... *sigh*. JMT.
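The thread's challenge is easy to make concrete with a short Python sketch (the `to_base` helper is made up for this post, not a standard library function): the arithmetic is done once in ordinary positional base 10, and converting the result to another base is purely mechanical.

```python
def to_base(n, b):
    """Render a non-negative integer as a digit string in base b (2..10)."""
    if n == 0:
        return "0"
    digits = []
    while n:
        digits.append(str(n % b))
        n //= b
    return "".join(reversed(digits))

# The examples from the post, worked in base 10, then shown in base 6:
product = 217 * 3287                      # positional multiplication: trivial
print(to_base(217, 6), "x", to_base(3287, 6), "=", to_base(product, 6))

quotient, remainder = divmod(446824, 326)  # long division, also trivial in base 10
print(quotient, remainder)
```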
I could never figure out why mathematicians, with nothing better to do, came up with using bases other than 10. As far as practicality is concerned, they are totally useless.
As for Roman numerals, I can add and subtract with them, but I am really thinking of the letters as numbers in base 10. The only practical use I know of for them is telling which Super Bowl is playing.
Even numbering the chapters of a book with Roman numerals is done only to separate them from the page numbers.
Otherwise they are merely an archaic form of number symbols that no longer have a practical use, just as numbers in a base other than 10 will not help you measure up your patio slab and calculate how much concrete is needed.

Able to leap tall tales in a single groan.
Originally posted by RustyBattleship: I could never figure out why mathematicians, with nothing better to do, came up with using bases other than 10.
Originally posted by Officer of Engineers: Computer processors. A 32-bit processor can only address 4 gigabytes of memory. Hence why we moved off 16-bit onto 32-bit and now 64-bit.
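For reference, the addressing ceiling follows directly from the pointer width: an n-bit address can name 2^n distinct bytes. A quick check in Python:

```python
# Bytes addressable with an n-bit flat pointer: 2**n.
for bits in (16, 32, 64):
    print(f"{bits}-bit -> {2**bits:,} bytes")

# 2**32 bytes is exactly 4 GiB, the flat addressing limit of a 32-bit CPU.
assert 2**32 == 4 * 1024**3
```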
Originally posted by Officer of Engineers: Can you define a better mathematical representation, then?
For what they are intended for in Boolean algebra and fundamental digital concepts (e.g. combinational/sequential logic circuits), there is nothing better than binary, because base 2 is a perfect fit for representing ON/OFF states. Even in programming, if you need to manipulate a register directly, binary often gives you better clarity because you can see what you are doing to each bit. The moment you represent a register value in a different base, you have to convert it back to binary before you can map each bit onto the register. Of course, with a 32-bit or 64-bit CPU this rapidly becomes unwieldy, hence we mostly use high-level languages there, where everything is abstracted.
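The point about clarity is easy to demonstrate. In this sketch the register name and bit layout are invented for illustration (no real hardware is being modelled), but the binary literals make it obvious which bit each operation touches:

```python
# Hypothetical 8-bit control register; each bit is one flag.
ENABLE   = 0b00000001  # bit 0
IRQ_MASK = 0b00000100  # bit 2
RESET    = 0b10000000  # bit 7

reg = 0b10000000                 # start with only the RESET bit set
reg |= ENABLE | IRQ_MASK         # set bits 0 and 2
reg &= ~RESET & 0xFF             # clear bit 7, keeping the value 8 bits wide
print(format(reg, "08b"))        # -> 00000101
```

Written in decimal (reg = 128, OR with 5, AND with 127), the same operations say nothing about which flags are involved until you convert back to binary.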
Originally posted by Officer of Engineers: I mean that given the decision to use two-state logic, there was no choice but to go to non-base-10 math for computer processing.
Mankind's greatest invention
Everyone seems to have missed the obvious. Spectacles, eyeglasses, corrective lenses.
It is no coincidence that they became generally available as the Renaissance kicked in*. From then on, presbyopia (old-age farsightedness) did not immediately retire craftsmen, scholars, artists, etc.
Just think: before the 15th century, huge numbers of people were unable to continue working after 40. Maybe earlier, if one considers the effect on eyesight of working in rooms dimly lit by candles or oil lamps.
Civilisation, not to mention civility, has risen and fallen. It is only since the invention of corrective lenses that invention and development have gone on and on.
* Purists, don't argue. I am very well aware that the first eyewear was in use in Florence around 1200 AD, but as late as the mid-16th century Leonardo da Vinci was still getting into trouble with the Inquisition for playing around with lens combinations. Plus, the cost would have been beyond the masses.