Greatest invention man created throughout history?


  • maybe not an invention but the greatest "feature" of ours is our thumbs :D
    Love all, trust a few, do wrong to none; be able for thine enemy rather in power than use; and keep thy friend under thine own life's key; be checked for silence, but never taxed for speech.

    • Originally posted by subba View Post
      Try multiplying 217 x 3287 using base 6.

      Try multiplying 4328 x 2138 using Roman numerals.

      Try dividing 446824 by 326 using base 6 or Roman numerals.

      Do the same using base 10.

      Putting a 1 followed by a zero in the tenth digit's place is mankind's BIGGEST jump in civilization. But then..*sigh* JMT.
      You have not proved that humanity's accumulated mathematics knowledge could not have been developed had we adopted another base. The examples you have given are easily done in base 10 because we were educated with algorithms that work in that base; performing that kind of arithmetic is quite straightforward in any base. We certainly know that any mathematics, no matter how complicated, can be done in base 2, given the use of binary in computer science (even though it is impractical for anything other than simple problems without a computer). Therefore, to prove your case, I think you have to demonstrate that the development of our advanced maths knowledge would have been impeded if we had gone with something other than base 10. I think you may have a point, but I just don't know. :)
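
      To illustrate that point, here is a minimal Python sketch (the to_base helper is my own, not from any library): the multiplication itself is base-independent, and only the digit strings we read and write change.

      ```python
      # Sketch: the quoted multiplication done once; only the display base differs.

      def to_base(n, base):
          """Render a non-negative integer as a digit string in the given base."""
          if n == 0:
              return "0"
          digits = []
          while n:
              n, r = divmod(n, base)
              digits.append("0123456789ABCDEF"[r])
          return "".join(reversed(digits))

      a, b = 217, 3287
      product = a * b  # the arithmetic is the same regardless of base

      print(to_base(a, 10), "x", to_base(b, 10), "=", to_base(product, 10))
      print(to_base(a, 6), "x", to_base(b, 6), "=", to_base(product, 6))

      # int() parses the base-6 string back, confirming the round trip.
      assert int(to_base(product, 6), 6) == product
      ```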

      • I could never figure out why mathematicians, with nothing better to do, came up with using bases other than 10. As far as practicality is concerned, those bases are totally useless.

        As for Roman numerals, I can add and subtract with them, but I am actually thinking of the letters as numbers in base 10. The only practical use I know of for them is reading which Super Bowl is being played.

        Even numbering the chapters of a book with Roman numerals is done only to separate chapter numbers from page numbers.

        Otherwise they are merely an archaic form of number symbols with no remaining practical use, just as numbers in a base other than 10 will not help you measure up your patio slab and calculate how much concrete is needed.
        Able to leap tall tales in a single groan.

        • How about evidence-based medicine?
          USS Toledo, SSN 769

          • Originally posted by RustyBattleship View Post
            I could never figure out why mathematicians, with nothing better to do, came up with using bases other than 10.
            Computer processors. A 32-bit processor can only address 4 gigabytes (2^32 bytes) of memory. That's why we moved off 16-bit onto 32-bit and now 64-bit.
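
            As a back-of-the-envelope check in Python (the human helper is just for display), each pointer width reaches 2**bits distinct byte addresses:

            ```python
            # 2**bits byte addresses per pointer width.
            UNITS = ["B", "KiB", "MiB", "GiB", "TiB", "PiB", "EiB"]

            def human(n):
                i = 0
                while n >= 1024 and i < len(UNITS) - 1:
                    n //= 1024
                    i += 1
                return f"{n} {UNITS[i]}"

            for bits in (16, 32, 64):
                print(f"{bits}-bit addressing: {human(2 ** bits)}")
            # 16-bit: 64 KiB, 32-bit: 4 GiB, 64-bit: 16 EiB
            ```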

            • Originally posted by Officer of Engineers View Post
              Computer processors. A 32-bit processor can only address 4 gigabytes (2^32 bytes) of memory. That's why we moved off 16-bit onto 32-bit and now 64-bit.
              Computer transistors can only operate in one of two states, ON or OFF, so we came up with base 2 to represent those states: 1 being ON, 0 being OFF. Octal (base 8) and the more common hexadecimal (base 16) are pretty much only used to shorten binary numbers and make them easier for humans to read, since it is very easy to convert binary into those bases. But the thing with 32/64-bit addressing is a computer-architecture limit rather than a number-base constraint, because the addresses are still binary representations. :)
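
              For instance, in Python (ordinary built-ins, shown purely as an illustration), each hex digit stands for exactly 4 bits and each octal digit for exactly 3, so converting to and from binary is mechanical:

              ```python
              value = 0b1011_0110  # a byte written out in binary

              print(bin(value))  # 0b10110110
              print(oct(value))  # 0o266
              print(hex(value))  # 0xb6 -- '1011' -> b, '0110' -> 6

              # All three spellings parse back to the same integer.
              assert int("10110110", 2) == int("266", 8) == int("b6", 16) == 182
              ```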

              • Can you define a better mathematical representation then?

                • Originally posted by Officer of Engineers View Post
                  Can you define a better mathematical representation then?
                  I'm not sure I understand the question, Sir. Do you mean: can I come up with something better than binary in regards to computers?

                  For what they are intended to do in Boolean algebra and fundamental digital concepts (e.g. combinational/sequential logic circuits), there is nothing better than binary, because base 2 is perfect for representing ON/OFF states. Even in programming, if you need to manipulate a register directly, binary often gives you better clarity because you can see what you are doing to each bit. The moment you represent a register value in a different base, you have to convert it back to binary before you can map each bit onto the register. Of course, with a 32-bit or 64-bit CPU this rapidly becomes unwieldy, hence we mostly use high-level languages there, where everything is abstracted.
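
                  As a small illustration in Python (the register and flag names here are hypothetical, made up for the example), binary literals keep the per-bit intent visible in a way a hex value does not:

                  ```python
                  # Hypothetical 8-bit control register; bit positions invented
                  # for illustration, not taken from any real device.
                  ENABLE   = 0b0000_0001  # bit 0
                  IRQ_MASK = 0b0000_0100  # bit 2
                  RESET    = 0b1000_0000  # bit 7

                  reg = 0b0000_0000
                  reg |= ENABLE | IRQ_MASK   # set bits 0 and 2
                  reg &= ~RESET & 0xFF       # clear bit 7, staying within 8 bits

                  print(f"{reg:#010b}")      # 0b00000101 -- every bit's state is visible
                  # The same value printed as hex (0x5) hides which bits are on
                  # until you convert it back to binary in your head.
                  ```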

                  • I mean, based on the decision to use two-state logic, there was no choice but to go to a non-base-10 math for computer processing.

                    • Originally posted by Officer of Engineers View Post
                      I mean, based on the decision to use two-state logic, there was no choice but to go to a non-base-10 math for computer processing.
                      So true. What a computer lacks in the mathematical elegance of base-10 formulae, it makes up for with sheer CPU cycles. At the user end, all equations are still base 10; the way a computer solves them comes down to millions of cycles and a few fundamental Boolean relationships like AND, OR and NOT. How it gets from one end to the other makes me go crazy just trying to think about it.
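
                      To make that concrete, here is a toy Python sketch (my own construction, not how any real ALU is implemented) of addition bottoming out in AND, OR and NOT: a one-bit full adder, chained to add two 4-bit numbers.

                      ```python
                      def AND(a, b): return a & b
                      def OR(a, b):  return a | b
                      def NOT(a):    return 1 - a

                      def XOR(a, b):  # composed from the three primitives
                          return OR(AND(a, NOT(b)), AND(NOT(a), b))

                      def full_adder(a, b, carry_in):
                          s = XOR(XOR(a, b), carry_in)
                          carry_out = OR(AND(a, b), AND(carry_in, XOR(a, b)))
                          return s, carry_out

                      def add_bits(xs, ys):
                          """Add two equal-length bit lists, least significant bit first."""
                          out, carry = [], 0
                          for a, b in zip(xs, ys):
                              s, carry = full_adder(a, b, carry)
                              out.append(s)
                          return out + [carry]

                      # 6 + 3 = 9: 6 is 0110 and 3 is 0011, written LSB-first below.
                      print(add_bits([0, 1, 1, 0], [1, 1, 0, 0]))  # [1, 0, 0, 1, 0] = 9
                      ```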

                      • Air conditioner, shish kebab

                        • Guitar

                          • Mankind's greatest invention

                            Everyone seems to have missed the obvious: spectacles, eyeglasses, corrective lenses.
                            It is no coincidence that they became generally available just as the Renaissance kicked in*. From then on, presbyopia (old-age farsightedness) no longer immediately retired craftsmen, scholars, artists, etc.
                            Just think: before the 15th C, huge numbers of people were unable to continue working after 40. Maybe earlier, considering the effect on eyesight of working in dim candle- or oil-lamp-lit rooms.
                            Civilisation, not to mention civility, has risen and fallen. It is only since the invention of corrective lenses that invention and development have gone on and on.

                            * Purists, don't argue. I am very well aware that the first eyewear was in use in Italy by the late 13th century, but as late as the early 16th C Leonardo da Vinci was still getting into trouble with the Inquisition for playing around with lens combinations. Plus the cost would have been beyond the masses.

                            • I have changed my mind. Not antiseptics, not even written language. It's cheese, hands down. I simply cannot imagine life without cheese. I doubt it would be worth living.
                              I enjoy being wrong too much to change my mind.

                              • Originally posted by ArmchairGeneral View Post
                                I have changed my mind. Not antiseptics, not even written language. It's cheese, hands down. I simply cannot imagine life without cheese. I doubt it would be worth living.

                                Ben Gunn's right with you there, A/G. Toasted, mostly.
