
Author Topic: Computer evolution, Man machine interface and the Pleo  (Read 6172 times)

radioastronomer

  • Cycad leaf fancier
  • ** Posts: 26
Computer evolution, Man machine interface and the Pleo
« on: January 07, 2009, 08:09:30 AM »

Note: I had to split my "ramblings" into multiple parts due to board limitations.  :P

Something I wrote late at night, so bear with me if it rambles or becomes incoherent.

Quoting Mr. Peabody: “Sherman, get the Wayback Machine!”

Back in the 1960s I watched a show called “Star Trek”. One of the basic themes that ran thru many of those shows was the impact of computers on both the starship and the worlds/societies it visited. From that moment on, I became fascinated with computers and machine intelligence. Moore’s law had yet to be defined and the microprocessor had not yet been invented. Heck, the very first transistor had only been invented in 1947, and on April 25, 1961, the first patent was granted for the integrated circuit. And as they like to say: “the rest is history”.

Anyhow, when Star Trek was being aired for the first time, RTL, DTL, and TTL were still king of the hill. Many of the computers of that time were large mainframes built from discrete circuitry that used electromechanical interfaces such as teletypes. There were often articles about learning machines in publications such as Scientific American and the like; however, they only gave the first glimmer of the possibilities of where computer technology could (and did) lead us. NASA’s Mercury, Gemini, and Apollo were the ultimate in spacecraft design at the time, and the future of manned spaceflight seemed to have no end in sight.

Then the economic “crash” of the late 70s reared its ugly head. Inflation reigned supreme, a hostage crisis became front-line news, unemployment loomed over the average American’s head, and the space program pretty much dragged to a halt. Yet even amongst these nation-shaking issues, work on those tiny chips continued. Large-scale TTL and other early technologies were slowly being superseded by CMOS, and Moore’s law continued unabated. These days, other than some small-scale integration, CMOS has pretty much replaced those early silicon designs.

So what is CMOS I hear you cry?

"CMOS ("see-moss"), which stands for complementary metal-oxide semiconductor, is a major class of integrated circuits. CMOS chips include microprocessors, microcontrollers, static RAM, and other digital logic circuits. The central characteristic of the technology is that it only uses significant power when its transistors are switching between on and off states. Consequently, CMOS devices use little power and do not produce as much heat as other forms of logic. CMOS also allows a high density of logic functions on a chip."

"The phrase "metal-oxide-semiconductor" is a reference to the nature of the fabrication process originally used to build CMOS chips. That process created field effect transistors (FETs) having a metal gate electrode placed on top of an oxide insulator, which in turn is on top of a semiconductor material. Instead of metal, today the gate electrodes are almost always made from a different material, polysilicon, but the name CMOS nevertheless continues to be used for the modern descendants of the original process.”

Let me digress a bit further into the culture of those times. Overpopulation was a great worry, the Vietnam War caused a huge backlash by the counter-culture types, and racial clashes were commonplace. I still remember walking into bookstores and seeing the posters of the time showing a ruined society with waves of people everywhere. Bike gangs, drugs and hippies were all the rage. During this tumultuous time, progress continued on these tiny circuits which would later evolve into the microprocessor. The record player/radio, television, and the telephone were about all that the average house contained, and they certainly did not reflect the great technological leaps that were happening at the time. Banking was the closest that folks got to real-world computing, and the news often had horror stories about computer banking errors. This certainly did not instill trust in this budding technology among the general public. I often heard “what good are they” or “new fangled” during that time frame. Oh, most folks knew NASA needed computers; to the general masses, however, they were more of a bother than a boon.

This mindset culminated towards the end of the 60’s and into the first part of the 70’s. Many books were written about the “information age”, e.g., “Future Shock” by Alvin Toffler. The supposition was that technology was going to “explode” at such a pace that the average person would be lost in this sea of technology and end up rejecting and/or being buried by it. I personally did not adhere to that mindset, and I so wanted my own computer. Unfortunately, computers were still the realm of either science fiction or large companies. The average individual could not own their very own computer. I would lament this fact to my dad and others and would receive this reply more often than not: “what in the heck would you do with it?”

To this day I remember a neighbor taking me to a Control Data Corporation magnetic core memory plant in or about 1971. What a treat! I stood in awe looking at all of the machines, terminals, teletypes, and computers in this huge building. I was in heaven. This was the place where the memory arrays were built for those huge mainframes. I was allowed into the room with the sea of workstations where women strung tiny ferrite beads on strands of wire under magnifying glasses. If you have never seen a core memory plane in real life, you are missing out on a real work of art. Each core plane was strung with gold, red and green wires not much bigger than sewing thread. The cores themselves were so tiny you could barely see the hole in the center of each. All these colored wires caused the core plane to glitter with tiny rainbows of light under the fluorescent lighting. I did make one BIG faux pas, though. I brushed a fingertip over the top plane of one of the core stacks to feel it, not realizing I was causing damage that would take almost two days of work to repair. I still feel bad about that to this day.

At the same time, there was another revolution going on that many have not yet extrapolated to its logical conclusion. A scientist by the name of Dr. Gordon Moore postulated (in a paper in the April 19, 1965 edition of Electronics magazine titled “Cramming more components onto integrated circuits”) that industry would be able to double the number of circuits on an electronic chip, while cutting the cost per component roughly in half, every year. Quoting from that paper:

 “The complexity for minimum component costs has increased at a rate of roughly a factor of two per year. Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years.” (Moore 1965)

This was just a simple log-linear relationship between complexity and time. In 1975 (ten years later), Dr. Moore delivered a paper to the IEEE International Electron Devices Meeting in which he showed a plot of semiconductor devices that followed his prediction remarkably well. There were some minor revisions to the curve; most notably, his predicted doubling period stretched to roughly every two years (popularly quoted as 18 months) as opposed to one year. This curve came to be known as Moore’s Law and even got its own equation.
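
Since the paragraph above boils down to a single doubling relation, here is a minimal sketch of it in Python. The 2300-transistor (1971) starting point comes from later in this post; the two-year doubling period is an assumption used only for illustration.

# Minimal sketch of the Moore's-law doubling relation: N(t) = N0 * 2**((t - t0) / T).
# The 1971 starting point (2300 transistors) is quoted later in this post;
# the 2-year doubling period T is an assumption for illustration.

def projected_transistors(year, n0=2300, year0=1971, doubling_years=2.0):
    """Project a transistor count assuming a fixed doubling period."""
    return n0 * 2 ** ((year - year0) / doubling_years)

for year in (1971, 1981, 1991, 2001, 2007):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
# The 2007 projection (~600 million) lands within a factor of about 1.4
# of the 820-million-transistor figure quoted later in this thread for the Penryn.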

In a physics class I took, I heard a professor remark, “if airplanes had progressed with the same rapidity and complexity as the microprocessor, we would have landed on the Moon ten years after the Wright Brothers flew at Kitty Hawk”. This may have been a bit overdramatic; however, it drives home the point of just how complex and capable these “tiny brains” called microprocessors are becoming.

Enter the microprocessor. Up until this point, the central processing unit of a computer had been made up of a number of different circuits, starting with tubes and relays in the very early machines, then migrating to transistors, and finally to discrete integrated circuits.

In 1971, working at Intel, Dr. Federico Faggin designed the first microprocessor, the 4004. Interesting anecdote: Dr. Faggin was in his lab all alone late one evening in January of 1971. He had received his first 4004 Central Processing Unit (CPU) wafer late that day and wanted to test it. He worked late into the night testing this processor, and I can imagine the whoop as he realized it worked! His wife, Elvia, was the first person to share his triumph. If you ever peel the cover off of a 4004 CPU and look at it under a microscope (I would not recommend it; they are becoming collector pieces commanding big bucks on eBay), you will notice the initials FF, for Federico Faggin, etched alongside the circuit.

Note: a circuit wafer is a round slice of silicon with a group of chips etched onto the surface, which are then cut up and packaged into the individual chips we see in our computers.

This little chip was a four-bit processor that contained 2300 transistors and was designed specifically as a calculator chip. This four-bit “brain” on a chip was rapidly superseded by eight-bit processors that became the mainstay of processor technology throughout the rest of the 70s. Over the following decades, however, 8 bits has expanded to 64, and even 256 bits in some specialized designs.
« Last Edit: January 07, 2009, 08:40:00 AM by mweed »
Logged


radioastronomer

  • Cycad leaf fancier
  • ** Posts: 26
Computer evolution, Man machine interface and the Pleo Part 2
« Reply #1 on: January 07, 2009, 08:12:37 AM »

The result of Moore’s Law is that a single processor (such as the current Core 2 Extreme QX9650) contains approximately 820 million transistors. This is a far cry from the humble beginnings of the first microprocessor with only 2300 transistors. It would take more than a third of a million “i4004” processors (the first processor) to equal the transistor count of a single Core 2 Extreme QX9650. The clock speed of the CPU has increased dramatically as well. The first microprocessor had a maximum clock of 740 kHz; today we are rapidly approaching 5 GHz. With the advent of massively parallel processing and clustering of CPUs, the humble microprocessor has evolved into the “supercomputer” realm in a way that would have truly boggled the mind back in 1971. Note: up until now there has been a very hard “wall” (timing between chips) when using multiple processors (massively parallel processing). This has now been solved within the past few months.

To recap:

January 2008: sticking to Intel (yes, I know there are AMD, HP, IBM, and a host of others out there), the current production processor for the home computer is called the Penryn. The original 4004 contained 2300 transistors, ran at a clock speed of ~740 kHz, used a PMOS process with 10 µm line widths, and had a 4-bit architecture. Just 36 years later, the Penryn is built on a 45 nm (0.045 µm) high-k/metal gate process, contains 820 million transistors, and can be overclocked to exceed 4 GHz.
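
As a quick back-of-the-envelope check on the recap above (all figures taken from this thread; the 4 GHz clock is the overclocked value just mentioned), the arithmetic looks like this:

# Back-of-the-envelope comparison of the 4004 (1971) and the Penryn (2007),
# using only figures quoted in this thread. The 4 GHz figure is the
# overclocked Penryn clock mentioned above.
i4004_transistors = 2_300
i4004_clock_hz = 740e3            # 740 kHz
penryn_transistors = 820_000_000
penryn_clock_hz = 4e9             # ~4 GHz overclocked

print(f"Transistor ratio: {penryn_transistors / i4004_transistors:,.0f}x")  # ~356,522x
print(f"Clock ratio:      {penryn_clock_hz / i4004_clock_hz:,.0f}x")        # ~5,405x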

Nothing demonstrates this revolution more than the graphics processor (GPU). Back in the days of the mainframes, it took boxes the size of a large refrigerator to drive a display such as the IBM 2250. Enter Nvidia!

In September 1999, Nvidia produced and marketed what it billed as the world’s first GPU, the GeForce 256. This was an amazing chip: 15 million transistors, hardware Transform & Lighting (T&L), 10 million polygons per second, and a memory bandwidth of 64 Megabytes per second. It wowed the world and was hailed as a landmark breakthrough.

In 2008, this very same company marketed its latest GPU, the GeForce GTX 280. It contains 1.4 billion transistors, produces a texture fill rate of 48.2 billion texels per second, and has a memory bandwidth of over 141 gigabytes per second. And this trend does not look like it is going to stop for the foreseeable future. (That is a timeframe of only nine years between the two.)

You may be wondering where I am going with this.

IMHO, it will not be biological systems that ensure longevity, but silicon instead. There are leaps and bounds I read about every day in this technology: multiple-core CPUs, massively parallel processing (CUDA comes to mind), faster bandwidth, lower latency, higher transistor density, etc. There is some thought in the industry that we may in fact be able to directly link a human brain to a silicon one, effectively expanding the biological into the machine. (Note: just recently a process of marrying silicon to neurons has been developed that does not eventually kill the neurons.) Pure layman conjecture here: would this then ultimately allow for our consciousness to be augmented by silicon in real time? I would have to say that would be the logical end result (pun intended :)).

Another revolution that has been taking place alongside the chips/CPUs is our ability to communicate (both between computers and between people). In 1907, Lee De Forest patented the triode vacuum tube. This was the breakthrough that enabled long-distance wireless communications. Couple that with Sarnoff, Marconi, Tesla, and others and boom: a parallel revolution took place. We could add the explosion of emergent technologies in programming, medicine, biochemistry, material sciences, metallurgy, batteries/power, etc. I could write a book on each; however, this post is getting way too long as it is. All of this technology is converging into a real revolution that is shaping not only our lives but also the very way we look at the universe itself.

Let’s get back to communications. For now I am only going to discuss machine-to-machine communication, since man/machine/robotic communication is the focus of this post. Over the years, people improved and invented new ways for machines to “talk” with one another, allowing automated control and information exchange. A number of standards were created, such as RS-232, 20 mA current loop, RS-422, IEEE-488, etc. Each of these had its advantages and disadvantages; however, by standardizing the communication protocols and signal levels, engineers in different labs, companies, etc. could design machines that would readily communicate.
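
Those serial standards are still in everyday use. As a small illustration, here is a minimal sketch using the third-party pyserial package; the port name, baud rate, and message are assumptions, not tied to any particular device:

# Minimal sketch of opening an RS-232-style serial link from a PC using the
# third-party pyserial package (pip install pyserial). The port name, baud
# rate, and message are illustrative assumptions.
import serial

with serial.Serial("/dev/ttyUSB0", baudrate=9600, timeout=1) as port:
    port.write(b"HELLO\r\n")          # send a line to the device
    reply = port.readline()           # read one line back (or time out)
    print(reply.decode(errors="replace"))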

In the late 1960s, the Department of Defense Advanced Research Projects Agency (DARPA) sponsored a program to link together the computers of certain universities and research centers located in different cities. This became what was known as ARPAnet (also sometimes called DARPAnet). Alongside ARPAnet (especially in the 80s), bulletin board systems (BBSs) sprang up around the globe, effectively linking home computers via phone lines. Many of these bulletin boards were dedicated to individual computer platforms such as Atari, Commodore, Texas Instruments, and the like. This allowed the various users to share stories, information, and, more importantly, programs. Unfortunately, long-distance phone charges limited the number of bulletin boards and the time the typical home user could spend on them.

In 1989, English computer scientist Sir Timothy Berners-Lee, working at CERN, basically invented the World Wide Web, building on a concept called hypertext. Thus ARPAnet, BBSs, and the like “morphed” into the World Wide Web as we know it today, allowing many different platforms (Mac and PC come to mind) to share data, such as this Pleo board we are currently on. The history of ARPAnet and the Web can be found in “Where Wizards Stay Up Late” by Katie Hafner and Matthew Lyon.
Logged

radioastronomer

  • Cycad leaf fancier
  • ** Posts: 26
Computer evolution, Man machine interface and the Pleo Part 3
« Reply #2 on: January 07, 2009, 08:15:00 AM »

There is another real world phenomenon taking place that is still pretty obscure to the mainstream population - virtual worlds:

I used to play Traveller, D&D, and a host of other role-playing games during my undergrad years in college. However, computer gaming was getting its legs at the same time. My first computer game was a game called Adventure that we would play late at night on the IBM 360 mainframes in the computer lab. Then Zork came out for the Commodore and Atari and all bets were off.

BOOM! Computer and console gaming became huge, eventually overtaking paper RPGs as the medium of choice.

Now, with the ease of the Internet, better graphics, and the speed of personal computers, RPGs have come into their own on the PC (“EverQuest” and “World of Warcraft” are but two examples).

Let us digress for a moment and chat a little about computer/console gaming.

Prior to the 1970s, electronic gaming for the masses was pretty much limited to electromechanical pinball machines and the like. There were laboratory prototypes such as “tennis”, “Spacewar”, etc., but those were not available to the general population. Then in 1972 a coin-operated game was introduced to bars that changed everything: Pong. It had its own screen, and within a year over 10,000 of these game machines were sold. It was so wildly popular that Atari finally designed a home version in 1975. It connected to the average family television set and could be played over and over without dropping quarter after quarter. What a remarkable invention this first-generation home gaming console was. Eventually newer and better machines were built with other variations of games such as “Breakout”.

Two things happened concurrently. One: large specialized game machines were designed and sold, eventually leading to the gaming arcade craze of the late 70s to mid 80s. Two: home game consoles improved, the best-selling of the time being the Atari 2600. This home machine accepted cartridges, each containing a ROM with a different game, giving households the flexibility of playing different games without having to pony up for a separate console for each one. Fortunately for the arcades, the large coin-operated machines were specialized for a particular game and could generate exceptional graphics for the time, whereas the Atari 2600 was still rather primitive by comparison.

In 1983-84 the entire gaming industry pretty much crashed. The typical home console was being bypassed by the emerging graphics of the newer home computers, and computer gaming eventually became more popular than the arcade. It took another three to four years for the industry to recover with the introduction of the Nintendo Entertainment System (NES).

There were a couple of major fallouts from this crash. One of the big changes was the shift in dominance for game consoles and games from American industry to Japanese industry, which is still in effect today. And because the newer game consoles used more advanced processors and graphics, the home game console began to rival the coin-operated arcade machines, effectively ending the gaming arcade craze of the 80s.

One of the more popular and soon-to-be-dominant game features was the control of an avatar (simulated person) thru an adventure or first-person-shooter type of game. These would become more and more sophisticated as processors and graphics engines became the powerhouses of today. A huge improvement was the shift from 2D graphics to complete 3D graphics and worlds, developed concurrently for both home gaming consoles and personal computers. These 3D graphics and avatars eventually migrated into the 3D communities that exist on the Internet today.
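
For a sense of what that 2D-to-3D shift means computationally, here is a minimal sketch of a perspective projection, the core operation behind 3D graphics. The focal length and the sample points are made-up values for illustration.

# Minimal sketch of perspective projection: mapping a 3D camera-space point
# onto a 2D screen. Focal length and sample points are illustrative values.

def project(x: float, y: float, z: float, focal: float = 1.0) -> tuple[float, float]:
    """Project a camera-space point onto the image plane (requires z > 0)."""
    return (focal * x / z, focal * y / z)

# The same corner of an object, moved farther from the camera, lands closer
# to the center of the screen, which is what creates the illusion of depth.
print(project(1.0, 1.0, 2.0))   # (0.5, 0.5)
print(project(1.0, 1.0, 4.0))   # (0.25, 0.25)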

Now there are many Internet virtual 3D worlds that you can explore using an avatar (e.g., EverQuest and others). These virtual worlds have real-time physics and very realistic-looking textures: walls, lawns, forests, grass, brick, stone, marble, water, sky, clouds, bushes, libraries, rooms, even whole towns. You can walk thru them using the avatar of your choice, seeing out of your avatar’s eyes other avatars walking thru the same virtual world, and you can congregate and chat with them. Brick looks like brick; add marble, cement, flora and fauna, wood, lakes, waterfalls, pools, metal, etc. It is just like being there.

There are whole websites devoted to nothing but textures for building a world/community to add to the existing ones already out there. I know of one, which I have access to, that would take you months to explore; there are castles, gardens, forests, towns, homes, etc. One person even made a New York street complete with cabs, noise, and high-rises you could get into (including riding the elevators). You could take a cruise on a cruise ship, swim, ride wave runners, etc. Snow would fall; there was night and day; the moon phases would change. I even walked by a lake where I could see the stars reflected in the water. How cool is that?

The line is getting thin indeed between virtual life and "real" life. For some, I think the virtual life is far more important.

I also remember a time when we were just a bunch of avatars standing around in front of a bar and grill on a virtual cobblestone street. It was like really being there. However, the folks I was casually chatting with were from all over the world. It was kind of strange walking down a realistic-looking street with a group of folks chatting away, knowing in the back of your head that these were people sitting at computers scattered across the globe. Some even become great friends who make sure they are logged in at the same time so they can meet.

This world we live in is certainly changing. As these fledgling virtual communities become more lifelike, and as more and more people acquire computers that have the necessary video and processing capability, I fully expect many more folks will flock to them. Add 3D motion-controlled virtual reality headsets, and you could almost forget you were in a computer-generated world as opposed to the physical one.

How you interact in such a community is by using a 3D avatar. These avatars move just like you do: they run, walk, sit, gesture, make facial expressions, etc. You can have a non-human avatar as well, such as a bird (that flies), a cat, a dog, a space alien, etc. There are entire web pages dedicated to building or buying the avatar of your dreams. After you have the basic avatar, you can then buy clothing and/or accessories (again, web pages are dedicated to this).

Virtual worlds will do nothing except grow. It is going to be a wild ride for the next 20 years or so as this not only emerges further into the “mainstream” but gets so sophisticated that you will have a hard time distinguishing between a virtual community and a “real” community (I should use the word physical from here on out, since a virtual community will be just as real as the physical one). There are two terms that you will run into regarding this: meatspace and cyberspace. Meatspace is a non-technical term for the physical realm we work and live in, whereas cyberspace denotes the virtual “world” residing inside a computer or on the net.
Logged

radioastronomer

  • Cycad leaf fancier
  • ** Posts: 26
Computer evolution, Man machine interface and the Pleo (last part)
« Reply #3 on: January 07, 2009, 08:19:06 AM »

VR is not just a fad either. It is growing FAST. Even the US Army has gotten involved. They are using the VR software from one of these online communities to set up virtual combat simulations for training.

However, there is some bad news and also some good news. The bad news is that we are currently hovering at the hard limits of what is possible with current computer science in terms of virtual worlds, and are already using every sleight-of-hand trick in the book to make them look like they scale better than they do.

There are a number of manifestations of these limits, but the fundamental underlying problem in all systems of this type is that while we know how to make perfectly deterministic entities scale reasonably well, the algorithms for managing any object in a virtual environment that must be measured (because its behavior cannot be predicted) scale very poorly. Consequently, this puts a pretty strict upper bound on the number of independent actors (e.g., individual avatars, or the weather they are experiencing) that can occupy the same virtual world. This has led to the "island" model of current virtual worlds, where there are numerous physically isolated virtual world environments with strict limits on the number of entities occupying any one environment, all duct-taped together at the seams, much to the annoyance of many people. The seams are more apparent than many would like.

Evidence of this exists in that DARPA has perpetually open RFPs for real-time spatial environments that can scale, since the military has all sorts of uses for this kind of thing. Static environments with few actors we know how to do well, but rich dynamic environments with lots of actors (or sensor feeds) are technologically infeasible with current computer science. There is a lot more to it than just this, but that is the gist of it.

There are also bandwidth limitations. At some modest level of scaling, it is more practical to have the server-side compute cloud generate the video frames and push them to the client rather than pushing models and action deltas.

To put it another way, video frames have a roughly fixed bandwidth cost per client, but pushing models and state updates has a worst-case cost that grows roughly with the square of the number of actors (in the worst case, every actor must be told about every other actor). Increasing effective bandwidth within a computer cluster is inexpensive; increasing that effective bandwidth all the way out to the last mile is not. And even if you could, you would still have to deal with aggravated latency issues.
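
A toy back-of-the-envelope comparison makes the crossover obvious. All of the numbers here (update size, tick rate, video bitrate, actor counts) are illustrative assumptions, not measurements from any real system:

# Toy comparison of the two approaches discussed above. Update size, tick
# rate, video bitrate, and actor counts are illustrative assumptions only.

UPDATE_BYTES = 64                  # assumed size of one actor state delta
UPDATES_PER_SEC = 20               # assumed simulation tick rate
VIDEO_BITS_PER_SEC = 4_000_000     # assumed per-client video stream (~4 Mbit/s)

def state_push_bits_per_sec(actors: int) -> float:
    """Worst case: every actor's delta is sent to every other actor."""
    return actors * (actors - 1) * UPDATE_BYTES * 8 * UPDATES_PER_SEC

def video_push_bits_per_sec(actors: int) -> float:
    """Streaming rendered frames: one fixed-rate stream per client."""
    return actors * VIDEO_BITS_PER_SEC

for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} actors: state ~{state_push_bits_per_sec(n):.1e} bit/s, "
          f"video ~{video_push_bits_per_sec(n):.1e} bit/s")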

There is some good news, however. There is a new (patented) family of massively scalable real-time algorithms that will allow us to fly right past the current scaling limitations by many orders of magnitude. Most people do not realize just how hard that scaling wall was, or the fact that we have been right up against it for many years. This has been a fundamental limit in so many industries that this solution will reshape the way we do many kinds of business.

To recap: virtual worlds as they exist now are at their limit, but after a quarter century of very little progress on the problem, it has finally been solved. Virtual worlds of the future will be vastly different from the current incarnations. People have wanted to build much richer environments for a long time, but until now have been limited by the technology.

So with these new algorithms, plus advances in computers and VR headsets, “The Matrix” is not as far off as you may think. By the way, I am not talking about a war with machines, but about logging into a virtual world that looks and acts like the “real” one.

And finally on to Automatons!!!

Artificial automatons (mechanical men) were described in literature as far back as ancient Greece. These ranged from simple moving statues to completely autonomous thinking machines such as the golden handmaidens of Hephaestus. As early as the 1100s, intricate mechanical “men” were built to entertain and be marveled at, such as Al-Jazari’s automata; some consider Al-Jazari the father of robotics. In 1738 a mechanical duck was created that ate, flapped its wings, etc. However, these were not truly autonomous. The first autonomous robots that reacted to environmental stimuli were finally developed in the 1940s.

The same advances in electronics that propelled the computer to what it is today have done the same for the robot. Modern robots now contain central processors, communication devices, computer interfaces, servo mechanisms, computer-controlled stepper motors, shape-memory alloy (usually nickel-titanium), high-speed buses, digital cameras (CMOS or CCD), micro gyros, advanced gear trains, feedback loops, touch sensors, audio sensors, speakers, etc. This allows the modern robot to be very lifelike. Some can even exhibit facial expressions as you interact with them.
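
To give a feel for one of those building blocks, here is a minimal sketch of a proportional feedback loop for a single servo joint. The gain, tick count, and angles are made-up illustrative values, not taken from any real robot (or from the Pleo):

# Minimal sketch of a proportional feedback loop for one servo joint.
# Gain, tick count, and angles are made-up values, not from any real robot.

def step_servo(current_deg: float, target_deg: float, gain: float = 0.2) -> float:
    """Move the joint a fraction of the remaining error each control tick."""
    error = target_deg - current_deg
    return current_deg + gain * error

angle = 0.0
for tick in range(20):
    angle = step_servo(angle, target_deg=45.0)
print(f"After 20 ticks the joint sits at ~{angle:.1f} degrees")  # approaches 45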

I remember my introduction to robots in grade school in the book “I, Robot” by Isaac Asimov. Note: Dr. Asimov coined the famous Three Laws of Robotics in that book. Movies and TV shows also featured robots, such as “Forbidden Planet”, “Silent Running”, and “Lost in Space”. We certainly have come a long way from the first faltering steps of the non-autonomous mechanical “men” to the self-learning autonomous robots we can now think of as pets and home helpers (such as the Roomba).

So here we are in 2008. A self-learning, autonomous pet dinosaur called Pleo is a robot that is fast becoming a “pet” in households across the globe, and it uses all of the aforementioned technology, engineering, and science. Not only does it use advanced materials, but it also uses processors, motors, programming, CAD, communications, and a whole host of other disciplines that would take me days to list.

Where is it going to end? I certainly cannot say, except to note that most predictions of the future have fallen way short or wide of what the future really holds. The movie “Blade Runner” comes to mind. Where do we draw the line as these creatures of rubber, silicon, metal, and glass end up blurring the line between life and non-life? The implications are staggering. As we integrate silicon with carbon (VR, cyberspace, man-machine interfaces, prosthetics, carbon enhancements such as real-time implanted IR vision, implanted microprocessors, et al.), I think the face of what we call “human” will have to change. This is not science fiction. We are taking the first faltering steps into this new reality, which IMHO will include sentient robotics and artificial intelligence.

All I can say is: It is going to be a wild ride!

Some definitions that you may run into as you read more about this in the future:

Automaton

1.   A mechanical figure or contrivance constructed to act as if by its own motive power; robot.
2.   A person or animal that acts in a monotonous, routine manner, without active intelligence.
3.   Something capable of acting automatically or without an external motive force.

Simulacrum

1.   A slight, unreal, or superficial likeness or semblance.
2.   An effigy, image, or representation: a simulacrum of Aphrodite.

Doppelgänger

1.   A ghostly double or counterpart of a living person.

Golem

1.   Jewish folklore: a figure artificially constructed in the form of a human being and endowed with life.
2.   A stupid and clumsy person; blockhead.
3.   An automaton or android; an automaton in the form of a human being.

Cyborg

1.   A person whose physiological functioning is aided by or dependent upon a mechanical or electronic device.

Cybernetics

1.   The study of human control functions and of mechanical and electronic systems designed to replace them, involving the application of statistical mechanics to communication engineering.

Robot

1.   A machine that resembles a human and does mechanical, routine tasks on command.
2.   Any machine or mechanical device that operates automatically with humanlike skill.
3.   Operating automatically: a robot train operating between airline terminals.

Urschleim

1.   Original protoplasm from which all life evolved (From the German)

Urschleim in Silicon

1.   The birth of Artificial Intelligence in silicon

Finite State Machine

1.   An abstract machine consisting of a set of states (including the initial state), a set of input events, a set of output events, and a state transition function. The function takes the current state and an input event and returns the new set of output events and the next state. Some states may be designated as "terminal states". The state machine can also be viewed as a function that maps an ordered sequence of input events into a corresponding sequence of (sets of) output events.
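
Since the finite state machine is the definition most directly relevant to how robotic toys are commonly programmed, here is a minimal sketch of one in Python. The states, events, and transition table are made-up examples for illustration; they are not taken from the Pleo’s actual behavior.

# Minimal finite state machine matching the definition above. States, events,
# and the transition table are made-up examples, not the Pleo's actual firmware.

TRANSITIONS = {
    # (current state, input event) -> (next state, output event)
    ("sleeping", "petted"):  ("awake",    "yawn"),
    ("awake",    "petted"):  ("happy",    "purr"),
    ("awake",    "ignored"): ("sleeping", "snore"),
    ("happy",    "fed"):     ("happy",    "chirp"),
    ("happy",    "ignored"): ("awake",    "whimper"),
}

def run(events, state="sleeping"):
    """Map an ordered sequence of input events to a sequence of output events."""
    outputs = []
    for event in events:
        state, output = TRANSITIONS.get((state, event), (state, None))
        outputs.append(output)
    return state, outputs

final_state, outputs = run(["petted", "petted", "fed", "ignored"])
print(final_state, outputs)   # awake ['yawn', 'purr', 'chirp', 'whimper']
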
Logged

wgb

  • Guest
Re: Computer evolution, Man machine interface and the Pleo
« Reply #4 on: January 07, 2009, 01:43:59 PM »

Very good read!
Logged

fancyfont

  • Cretaceous pleo master
  • * Posts: 2575
  • us Female
  • Pleo(s): Peeky, Pennie, JayCamerasaurus (J.C.), LOA, baby and DinhaJo
  • : 2013 winner, 2011 winner, 2008 winner, Tomato Harvest Festivals
  • No, Peeky and Pennie, you can't have the Harley!
Re: Computer evolution, Man machine interface and the Pleo
« Reply #5 on: January 07, 2009, 10:31:03 PM »

Now I know what radioastronomer means to you. You really are. 8)
Reading your article brought back memories of a young organ student of mine years ago. During the Christmas season in the early 1980's, my student told me his parents were giving him a choice: a new, updated electronic organ or an Apple. I had no idea what he was talking about. When he said computer, I pictured those huge things that took up many feet of wall space. Little could I have imagined at the time what an impact his decision would have. Needless to say, he chose an Apple over an organ.
I sure never thought I'd love the computer as much as I do now, or have a robotic dinosaur. VR will be the next step. Sounds pretty exciting. :)   
« Last Edit: January 07, 2009, 10:39:31 PM by fancyfont »
Logged

mweed

  • Grand Poobah
  • Triassic pleo master
  • ***** Posts: 1758
  • us Male
  • Pleo(s): Bob, Mopey
  • : Tomato Harvest Festivals
    • Professor
    • Wile_E_Coyote
    • Dr_Bunsen_Honeydew
    • Snoopy
    • Bob the Pleo
Re: Computer evolution, Man machine interface and the Pleo
« Reply #6 on: January 08, 2009, 12:47:20 PM »

So, radioastronomer, just want to get your opinion on a few things related to your post.

First, even though computing power has progressed rapidly, computer software has not.  Nowadays you can buy that 64-bit multi-core CPU for your computer, but the bulk of the software running on your machine is still 32-bit and doesn't know how to handle the extra cores.  For the most part, mainstream programming is 2-3 generations behind the hardware.   As Moore's law continues to play out in the hardware realm, won't we reach a point where the increase becomes pointless as the software lags further and further behind?

Secondly, based on Moore's law, many predict we should reach the point in the next 10-12 years where the average PC will have the same computational power as the human brain.  (Right now we're supposedly at about the same level as insects.)   But even though it's been around for decades, AI is still in its rudimentary stages.    It took Ugobe four years to bring Pleo to where it is today, and it is still seen by many as more of a toy than a real life form.  If Moore's law held true from Aibo to the likes of Pleo, why haven't we seen more progression?  Shouldn't Pleo be more powerful, smaller, and cheaper than Aibo?  What we have is cheaper, but much less sophisticated.  This doesn't seem to follow the trend, but is still considered cutting edge?

Finally, since Moore's law is an exponential progression, many futurists (like Kurzweil) talk about an upcoming point of singularity, where the growth rate essentially becomes infinite.  How do you see things progressing?
Logged

Bob_the_Pleo

  • Knight of the Order of Pizza
  • Head of the herd
  • ***** Posts: 68
  • we Male
  • : Tomato Harvest Festivals
  • Huh?!?
    • Gilligan
    • Bugs_Bunny
    • Charlie_Brown
    • Bob the Pleo
Re: Computer evolution, Man machine interface and the Pleo
« Reply #7 on: January 09, 2009, 03:21:00 PM »

You lost me at "ramblings".  Of course, generally when I "CMOS", I "eat moss". :P  You'd think with 6 CPUs I'd understand more.  But then again, I am still running 32-bit code . . .
Logged

Starfire

  • Head of the herd
  • **** Posts: 66
  • Female
  • Pleos serenade SeaWorld
Re: Computer evolution, Man machine interface and the Pleo
« Reply #8 on: January 09, 2009, 08:33:45 PM »

You're right, Mweed - every year we see all these reports of robots that will be released that can help around your house, clean, assist the elderly, etc.  But except for the few you see demoed, you don't see any that are within the reach of the average person.  If Pleo had been released with the learning ability as planned, that would have been a big step forward.  But it seems like we're treading water in this area.
Logged

fancyfont

  • Cretaceous pleo master
  • * Posts: 2575
  • us Female
  • Pleo(s): Peeky, Pennie, JayCamerasaurus (J.C.), LOA, baby and DinhaJo
  • : 2013 winner, 2011 winner, 2008 winner, Tomato Harvest Festivals
  • No, Peeky and Pennie, you can't have the Harley!
Re: Computer evolution, Man machine interface and the Pleo
« Reply #9 on: January 09, 2009, 09:27:15 PM »

I just got hold of a 2002 National Geographic magazine, and the first page on the inside cover was about ASIMO by Honda. The title is "We're building a dream one robot at a time."
Before Pleo, I would not have heard about ASIMO.
A paragraph reads, "The future of this exciting technology is even more promising. ASIMO has the potential to respond to simple voice commands, recognize faces, carry loads and even push wheeled objects. This means that one day, ASIMO could be quite useful in some very important tasks. Like assisting the elderly, and even helping with household chores. In essence, ASIMO might serve as another set of eyes, ears and legs for all kinds of people in need."

I think in the future robots will be in every household. I hope nearer rather than farther! I would love to see it in my lifetime.
Pleo is an amazing start. If only he could do the chores around here. :)
« Last Edit: January 09, 2009, 09:29:43 PM by fancyfont »
Logged

Bob_the_Pleo

  • Knight of the Order of Pizza
  • Head of the herd
  • ***** Posts: 68
  • we Male
  • : Tomato Harvest Festivals
  • Huh?!?
    • Gilligan
    • Bugs_Bunny
    • Charlie_Brown
    • Bob the Pleo
Re: Computer evolution, Man machine interface and the Pleo
« Reply #10 on: January 09, 2009, 09:49:56 PM »

Uh . . .   Well, Fred cooks, Salli & Teddi sew, Yogurt teaches, Sally runs the library, Grumpy . . . is grumpy, and James . . . just looks really cool in a tuxedo!
Logged

fancyfont

  • Cretaceous pleo master
  • * Posts: 2575
  • us Female
  • Pleo(s): Peeky, Pennie, JayCamerasaurus (J.C.), LOA, baby and DinhaJo
  • : 2013 winner, 2011 winner, 2008 winner, Tomato Harvest Festivals
  • No, Peeky and Pennie, you can't have the Harley!
Re: Computer evolution, Man machine interface and the Pleo
« Reply #11 on: January 10, 2009, 10:54:07 AM »

You got me there, Bob. :)
Logged

49er

  • Journeypleo
  • * Posts: 570
  • Male
Re: Computer evolution, Man machine interface and the Pleo
« Reply #12 on: February 06, 2009, 10:16:17 AM »

And after 7 years or more, ASIMO is still attached to his umbilical cord.