Personal Computers in Traveller

AlphaWhelp said:
I don't feel that keyboards will ever be done away with entirely. Even if voice recognition is hundreds of times more advanced than it is now, it won't and can't be perfect.

Think about, say, an airplane with everyone talking into their computers. Someone next to you is talking, and your computer is listening to that person in addition to you...

Personally, I prefer keyboard to voice input.
 
Hi BP,

BP said:
Not to put too fine a point on it - but that's mixing apples and oranges and grapes. ;)
Thanks for that info:)

However, how do you see binary compatibility competing/co-existing with source code and operating system portability?

TTFN,


Ian
 
AndrewW said:
...Nobody will ever need more than 640k.
:D (BTW - that was mostly a Microsoft thing - the hardware guys kept expanding...)

AndrewW said:
...
BP said:
'Free Software' and Open Source are about operating system platform compatibility. (And most of this is done commercially - a very tiny percentage of users actually re-compile programs for their OS - though I love open source myself.)

Though it does offer some hardware independence as well.
Certainly an option - but really not at all a requirement - a free and open source program can be written in machine language (or, as is more common, in a non-platform-dependent manner).

Open Standards actually have the greatest impact on platform/hardware independence, and this concept is often confused with the Free and Open Source software movements. Patents and licensing are the greatest market-inhibiting factors on truly advanced, open and compatible software and hardware technologies today (along with stock-market-driven ROIs).
 
BP said:
:D (BTW - that was mostly a Microsoft thing - the hardware guys kept expanding...)

Yeah, I know who said that. It just fit with what you mentioned, even if it's not a hardware issue.

BP said:
Open Standards actually have the greatest impact on platform/hardware independence, and this concept is often confused with the Free and Open Source software movements. Patents and licensing are the greatest market-inhibiting factors on truly advanced, open and compatible software and hardware technologies today (along with stock-market-driven ROIs).

Yup, I meant that some of it can be compiled for different hardware platforms. That's not something specific to open source, though having the source code does make it easier for someone to port it.
 
AndrewW said:
AlphaWhelp said:
I don't feel that keyboards will ever be done away with entirely. Even if voice recognition is hundreds of times more advanced than it is now, it won't and can't be perfect.

Think about, say, an airplane with everyone talking into their computers. Someone next to you is talking, and your computer is listening to that person in addition to you...
I had already thought of this and believe that future voice recognition tech would include directional voice pickup and also be able to differentiate one person's voice from another.
AlphaWhelp said:
If, for example, I'm trying to write a book, and one of the words I want to use is "Vargr", today's VRS would probably interpret that as barge. Even highly advanced VRS won't know that I meant to input Vargr and not Varger or Varg'r or some other strange way to spell Vargr.
No reason voice input cannot allow for correcting a word: provide a list of several similar-sounding words and let you select one (my keyboard spell checker does this for misspellings); allow spelling the word out loud if no suggestion meets your needs; and, as mentioned, still allow a pop-up virtual keyboard as needed, adding the new word to the recognition database just as my typing spell checker adds a new typed word it doesn't recognize. And IF everything were to move more toward voice, there would be no misspelled Varger, because there would be no text - storage and output would be audio, with no reason to convert to text and simulate a printed page until a user requests such to be done.
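To make that concrete, here's a rough Python sketch of the sort of correction loop I mean - difflib stands in for the sound-alike matching, and the word list is invented for the example:

Code:
import difflib

# Tiny stand-in lexicon; a real recognizer's dictionary would be huge.
lexicon = {"barge", "cargo", "verge", "margin"}

def recognize(heard):
    """Return the word heard, or let the user correct/teach it."""
    if heard in lexicon:
        return heard
    # Offer the closest-sounding known words, like a spell checker's suggestions.
    candidates = difflib.get_close_matches(heard, lexicon, n=3, cutoff=0.5)
    print("Did you mean:", candidates, "- or spell it out?")
    choice = input("> ").strip()
    lexicon.add(choice)   # learn the new word (e.g. "Vargr") for next time
    return choice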
 
Interestingly no-one has yet mentioned photonics. IMTU, the computers use light pulses not electrons. One of the features of photonics is that the signals run at, well, the speed of light; you can have multiple light beams of different wavelengths travelling down the same 'channel' in the same or opposite directions, and the 'channels' can cross each other. This leads to very compact, yet very fast computers that have negligible energy requirements.

In the real world, photonics research has been alive and well for many years. I remember talking to a PhD student working on this in the late 80's/early 90's (sometime before 1994, as that's when I got my PhD), and his prediction was that computers would work at terahertz frequencies. However, the biggest stumbling block so far is getting a data storage system that can record and supply data fast enough.

My colleague and friend, Paul, should chip in here if he reads this (can't remember what his forum name is) as he's a photonics researcher (a senior lecturer in the optoelectronics group).
 
I'm not saying you can't train a VRS to accept Vargr. Obviously, if you can train a VRS to accept cat, you can train it to accept Vargr. My point is simply that at some point of using a computer, unless it's wired directly to your brain, you will need to push a key or two, or more, because absolutely no VRS manufacturer is going to build their VRS with Vargr in the dictionary. You'll have to add it yourself, and the only way to do that is with keystrokes. (It is possible that a highly advanced VRS would simply accept a verbally dictated v-a-r-g-r, but more realistically, you'd need to access a pronunciation key via keyboard.)

It's just that there are lots of tools nowadays that perform the specialized functions of a knife - bread slicers, mechanical choppers, fry-makers, potato peelers, etc. - but if you're a chef, you are going to have to pick up a knife at some point in your career; there is no getting around it.
 
Stainless said:
Interestingly no-one has yet mentioned photonics.
The thread seems to be mostly about human-machine interfacing.

Besides photonics, my TUs since CT days have also made use of fluidic, mechanical (look up the German guy), gaseous and chemical processing systems... today one can add quantum computing.

Aside from alternative hardware bases, one can also look to analog vs. digital and clockless vs. timed systems. These actually exist (even to this day) and are, in theory, inherently faster and more capable than today's systems - just much more complicated to design and implement. Digital is only 'better' from a design and manufacturing standpoint - in reality (which is largely analog until one starts talking theoretical quantum states) it is inferior.

(BTW - the breakup of AT&T, and its impact on the world's largest research arm, precipitated by cable TV fears, is why we do not have the systems your friend predicted... another story.)
 
Stainless said:
Interestingly no-one has yet mentioned photonics. IMTU, the computers use light pulses not electrons.

IMTU, there are no ship computers with a Fib designation past TL 9, as they ALL are assumed to be at least that.
 
BP said:
Stainless said:
Interestingly no-one has yet mentioned photonics.
The thread seems to be mostly about human-machine interfacing.

Ooops, my bad for only skimming the thread.

OK, as far as looks are concerned, I would envisage a photonic computer being extremely small and capable of being made in effectively any shape desired. Thus, they could be seamlessly integrated into furniture, etc., even to the level of being fine filigree or latticework. Since they use small lasers and such, they will produce very little heat and need only small/modest batteries. So think of one of today's supercomputers at no more than the size of a Rubik's cube.

As an aside, I also envisage advanced photonic computers as being dynamically hardwired, i.e., the basic block of CPU plus data storage can be dynamically rearranged to clone new CPUs out of the data storage space as and when demand requires, and vice versa. Lastly, all data storage space is effectively the RAM, so there is no need to pre-cache, etc.
 
All data contained within RAM is utterly destroyed upon shutting the computer down. Having a computer that is "all RAM" would only work if the computer would never be turned off, even for a microsecond.
 
AlphaWhelp said:
All data contained within RAM is utterly destroyed upon shutting the computer down. Having a computer that is "all RAM" would only work if the computer would never be turned off, even for a microsecond.

Nope. RAM stands for Random Access Memory. It ALL depends on the storage device, NOT the access methodology.
 
DFW said:
AlphaWhelp said:
All data contained within RAM is utterly destroyed upon shutting the computer down. Having a computer that is "all RAM" would only work if the computer would never be turned off, even for a microsecond.

Nope. RAM stands for Random Access Memory. It ALL depends on the storage device, NOT the access methodology.

If all you are concerned about is the access methodology, then computers are already there.

Random Access means I can locate, access, and load data in any given file on demand. This is in contrast to Sequential Access, where I must progress through a list, loading every previous item before I reach the item I'm actually looking for. Sequential Access has its value: since it takes considerably less memory than Random Access, it is useful for files which you know are going to be read to EOF 100% of the time, such as a text file. Random Access is used when files are already very large and only portions of the file need to be loaded into memory at any given time.
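To make the distinction concrete, a quick Python sketch with a file of fixed-size records (the file name and sizes are invented for the example):

Code:
RECORD_SIZE = 32

# Write 1000 fixed-size records so the offsets are predictable.
with open("records.dat", "wb") as f:
    for i in range(1000):
        f.write(("record %04d" % i).ljust(RECORD_SIZE).encode())

# Sequential access: read everything up to the record we want.
with open("records.dat", "rb") as f:
    for _ in range(500):
        f.read(RECORD_SIZE)          # skip records 0..499 by reading them
    rec = f.read(RECORD_SIZE)        # finally, record 500

# Random access: seek straight to record 500 and read only it.
with open("records.dat", "rb") as f:
    f.seek(500 * RECORD_SIZE)
    rec = f.read(RECORD_SIZE)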

It's possible that data storage might become so large and vast that the memory saved doing sequential access becomes meaningless, but I don't see how converting everything to random access will actually make computers better than they are now.
 
AlphaWhelp said:
All data contained within RAM is utterly destroyed upon shutting the computer down. Having a computer that is "all RAM" would only work if the computer would never be turned off, even for a microsecond.

OK, so I didn't explain myself fully/clearly enough. Currently we have RAM because the hard disk is just too slow to supply data as the CPU needs it. Thus there is all sorts of quite impressive technology that tries to anticipate what the CPU will want next; while the CPU is busy working, that data is fetched from the hard disk and put into RAM, where it can hopefully be accessed relatively quickly. Even SSDs do this (although that may be partly because they have to work with pre-existing systems that explicitly and unavoidably use RAM). Ever had the problem of "popping" textures in a computer game because the hard disk can't supply the texture files fast enough?

With future photonic systems, the accessing of the data from the 'hard disk' is likely to occur at the same speed as the CPU itself. Thus, no prefetching required. Thus, one of the bottlenecks in current computers is dissolved.
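To make the bottleneck concrete, here's a crude Python sketch of the prefetch idea (all the class names are invented): a slow 'disk' fronted by a cache that guesses block n+1 will be wanted next and warms it in the background. With photonic storage running at CPU speed, none of this machinery would be needed - which is the point.

Code:
import threading, time

class SlowDisk:
    def read_block(self, n):
        time.sleep(0.01)                      # pretend seek/transfer latency
        return ("block-%d" % n).encode()

class PrefetchingCache:
    def __init__(self, disk):
        self.disk, self.cache = disk, {}
        self.lock = threading.Lock()

    def _fetch(self, n):
        data = self.disk.read_block(n)
        with self.lock:
            self.cache.setdefault(n, data)

    def read_block(self, n):
        with self.lock:
            data = self.cache.get(n)
        if data is None:                      # miss: pay the full latency
            self._fetch(n)
            data = self.cache[n]
        # Guess the access pattern is sequential: warm block n+1
        # while the CPU gets on with other work.
        threading.Thread(target=self._fetch, args=(n + 1,), daemon=True).start()
        return data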
 
Who knows? Quantum computers are becoming more and more popular; all major banks use them now for encryption purposes. It's entirely possible that the computer of the future is just utterly, fundamentally different from computer technology today, which is not really any different from computer technology 30 years ago, just with all the numbers inflated.

It's not unreasonable to anticipate, say, a computer made out of a single chip repeated over and over again. This chip, dubbed the Miracle Chip, reconfigures itself on the fly to suit whatever purposes the computer demands. Running a program that needs 10 gigs of graphics memory and 32 gigs of working memory? Suddenly 42 gigs of your empty "disk drive" space reconfigures itself into 10 gigs of graphics memory and 32 gigs of working memory, and reconfigures itself back when you're done with that program.

And we're not too far away from this kind of tech; already the most advanced military technology is using something called FPGAs, or Field-Programmable Gate Arrays, which work almost exactly how I described above, except that they have to be manually pre-programmed; they can't do it on their own.
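For flavor, here's a toy Python model of that reconfiguration bookkeeping - the class and role names are mine, purely illustrative:

Code:
class MiracleChipPool:
    """One pool of generic capacity, re-dedicated on demand."""
    def __init__(self, total_gb):
        self.roles = {"disk": total_gb}       # everything starts as storage

    def reassign(self, src, dst, gb):
        if self.roles.get(src, 0) < gb:
            raise ValueError("not enough %s capacity" % src)
        self.roles[src] -= gb
        self.roles[dst] = self.roles.get(dst, 0) + gb

pool = MiracleChipPool(128)
pool.reassign("disk", "graphics", 10)   # the 10 gigs the program asked for
pool.reassign("disk", "working", 32)    # ...plus its 32 gigs of working memory
print(pool.roles)                       # {'disk': 86, 'graphics': 10, 'working': 32}
pool.reassign("graphics", "disk", 10)   # program exits: hand it all back
pool.reassign("working", "disk", 32)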
 
FPGAs have been commercially available since the mid-80's IIRC - and you probably own devices with them inside. Larger-scale and fully general-purpose integrated CPU projects were actually done even earlier, and they have resurfaced today given the limits GPUs are pushing.

And the graphics memory vs. working memory split is what happens today in a great many PCs (though with not as many gigs on most desktops) - sharing main memory saves the cost of dedicated video memory.

Self-optimizing, re-coding microcircuitry has been around for a long time (FPGAs also play a part in that history). It is also done in code - I wrote self-re-optimizing runtime machine code myself back in the late 80's for line drawing and various numerical processing routines - though it can be even faster in hardware. Numerous research GPUs in the past have utilized just this approach.
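In spirit, that runtime specialization looks something like this Python toy (my 80's version was machine code, of course; the names here are invented):

Code:
def make_scanline_writer(color, width):
    # Build source code with the color and width folded in as constants,
    # then compile it at runtime - a (very tame) analogue of emitting a
    # specialized machine-code inner loop with no per-pixel tests.
    src = ("def write_scanline(buf, offset):\n"
           "    buf[offset:offset + %d] = bytes([%d]) * %d\n"
           % (width, color, width))
    namespace = {}
    exec(src, namespace)
    return namespace["write_scanline"]

fb = bytearray(640 * 480)                    # a toy 8-bit framebuffer
draw_row = make_scanline_writer(color=255, width=640)
draw_row(fb, 100 * 640)                      # fill scanline 100 in one go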

I would point out - having evaluated Cray X-MP supercomputers and massively parallel architectures in the 90's - that today's consumer video cards rivaled and exceeded such double-digit-million-dollar systems several years back (for some things). CUDA and OpenCL are really making an impact in the scientific arenas and have been going mainstream for some time.

Now if the rest of the software world can fully catch up to the 60's :D
 
AlphaWhelp said:
Who knows? Quantum computers are becoming more and more popular; all major banks use them now for encryption purposes.

Sorry, but no. I work in the space. Currently, they aren't used by banks for encryption as they don't yet exist. ACTUALLY, there is no real quantum encryption currently. There IS a way (using communication over short range fiber only) to check to see if a communication has been VIEWED in transit. This is used to prevent MiM (man in the middle) attacks.
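The principle is easy to caricature classically - a toy Python simulation in the spirit of BB84, not real physics code: measuring in the wrong basis randomizes the bit, so anyone viewing the channel shows up as an error rate.

Code:
import random

def transmit(bit, eavesdropper):
    basis = random.choice("+x")               # sender's encoding basis
    if eavesdropper and random.choice("+x") != basis:
        bit = random.randint(0, 1)            # wrong-basis measurement scrambles it
    return bit                                # receiver measures correctly (simplified)

def error_rate(eavesdropper, trials=10000):
    errors = 0
    for _ in range(trials):
        bit = random.randint(0, 1)
        if transmit(bit, eavesdropper) != bit:
            errors += 1
    return errors / trials

print(error_rate(False))   # ~0.00 - untouched fiber
print(error_rate(True))    # ~0.25 - the viewing leaves a statistical fingerprint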
 
DFW said:
... There IS a way (using communication over short range fiber only) to check to see if a communication has been VIEWED in transit. This is used to prevent MiM (man in the middle) attacks.
As a bonus, no technology is foolproof - quantum cryptography systems, like all other technologies, are 'hackable'. (Not implying DFW said otherwise, rather expanding on what he posted.)

In theory, such systems can be perfect - in implementation there are always imperfections (i.e. - weaknesses that can be taken advantage of).

Which is good for game play ;)
 
BP said:
DFW said:
... There IS a way (using communication over short range fiber only) to check to see if a communication has been VIEWED in transit. This is used to prevent MiM (man in the middle) attacks.
As a bonus, no technology is foolproof - quantum cryptography systems, like all other technologies, are 'hackable'. (Not implying DFW said otherwise, rather expanding on what he posted.)

In theory, such systems can be perfect - in implementation there are always imperfections (i.e. - weaknesses that can be taken advantage of).

Which is good for game play ;)

Always good fodder for the game.
 