Chips go toward fuzzy logic, away from yes/no

Maxinquaye

That cheeky buggerer
Super Member
Registered
Joined
Nov 10, 2009
Messages
10,361
Reaction score
1,032
Location
In your mind
Website
maxoneverything.wordpress.com
http://singularityhub.com/2010/09/04/a-computer-chip-based-on-probability-not-binary-video/

A lot of computing is not about determining a yes or a no, but rather a maybe. Just look at your inbox next time and you'll see it in effect: a spam filter relies, to a large extent, on determining the probability of something being spam.

Computer hardware today is ill-equipped to handle this, so it all gets lifted up to the software level, where probabilities can be handled. Wouldn't it be neat if there were hardware built to think in probabilities rather than in hard yes/no?
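To make that concrete, here's roughly what a spam filter does in software today - a minimal naive-Bayes sketch, with the per-word probabilities and the 50/50 prior invented purely for illustration:

```python
# Minimal naive-Bayes spam score. The per-word probabilities and the
# 50/50 prior are invented purely for illustration.
from math import prod

# (P(word | spam), P(word | ham)) -- hypothetical values:
WORD_PROBS = {
    "viagra":  (0.80, 0.01),
    "meeting": (0.05, 0.30),
    "free":    (0.60, 0.10),
}

def spam_probability(words, prior_spam=0.5):
    """P(spam | words) via Bayes' rule, assuming words are independent."""
    p_spam = prior_spam * prod(WORD_PROBS[w][0] for w in words if w in WORD_PROBS)
    p_ham = (1 - prior_spam) * prod(WORD_PROBS[w][1] for w in words if w in WORD_PROBS)
    return p_spam / (p_spam + p_ham)

print(spam_probability(["free", "viagra"]))   # ~0.998: almost certainly spam
print(spam_probability(["meeting", "free"]))  # 0.5: a genuine "maybe"
```

Every one of those multiplications runs as ordinary binary arithmetic; the pitch here is hardware that handles the "maybe" natively.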

http://singularityhub.com/2010/09/04/a-computer-chip-based-on-probability-not-binary-video/

The Cambridge, Massachusetts startup [Lyric Semiconductor] recently came out of stealth to announce the development of their new computer chip that calculates using probabilities. Lyric has used $20 million in DARPA and venture funding to rethink the way we process problems, from the basic architecture of its circuits all the way up to its software language. Everything is grey, not black and white.

It's awesome. Awesome.
 

Ruv Draba

Banned
Joined
Dec 29, 2007
Messages
5,114
Reaction score
1,322
Old news, Maxinquaye. Fuzzy logic has been with us for decades. In the 80s and 90s, the Japanese Fifth Generation Computing project toyed with fuzzy logic for desktop computers. Fuzzy logic was demonstrated then to be a niche function, and has now found its way into industrial applications like trains, and home appliances like rice cookers -- anywhere that systemic tolerance for imprecision is useful. It's nice to have a chip that does the computations more efficiently, but it's not new capability.

On the other hand, the applications for high-speed fuzzy logic computation may well have grown. Fuzzy prediction can be useful for speech recognition, traffic analysis, energy markets and mining Google for marketing data, to name a few. I assume that's why Lyric are investing in a chip to assist.
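For anyone who hasn't seen it, here's the sort of thing a rice cooker's fuzzy controller does - a toy sketch, with the membership functions and rule outputs invented purely for illustration:

```python
# Toy fuzzy controller: map a temperature reading to heater power.
# Membership functions and rule outputs are invented for illustration.

def mu_cold(t):   # degree to which t is "cold", in [0, 1]
    return max(0.0, min(1.0, (60 - t) / 20))

def mu_warm(t):   # degree to which t is "warm"
    return max(0.0, 1 - abs(t - 70) / 15)

def mu_hot(t):    # degree to which t is "hot"
    return max(0.0, min(1.0, (t - 80) / 20))

def heater_power(t):
    """Weighted average of rule outputs (a simple Sugeno-style defuzzification)."""
    rules = [
        (mu_cold(t), 1.0),   # if cold -> full power
        (mu_warm(t), 0.4),   # if warm -> moderate power
        (mu_hot(t),  0.0),   # if hot  -> off
    ]
    total = sum(w for w, _ in rules)
    return sum(w * out for w, out in rules) / total if total else 0.0

for t in (50, 58, 68, 95):
    print(t, round(heater_power(t), 2))
```

The point is the blending: a reading that is partly "cold" and partly "warm" produces an in-between power level instead of a hard switch.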
 

blacbird

Super Member
Registered
Joined
Mar 21, 2005
Messages
36,987
Reaction score
6,158
Location
The right earlobe of North America
Does this mean we now have Yes, No and Maybe? Like, oh, Maybe my word-processor will open this file, if it feels like it today?




Oh, wait, I already have a computer like that . . .
 

benbradley

It's a doggy dog world
Super Member
Registered
Joined
Dec 5, 2006
Messages
20,322
Reaction score
3,513
Location
Transcending Canines
I'm 100% sure this is not a new idea, and it's old enough that I can't remember when I first heard about it - maybe the '80s, maybe the '90s. Here's a book on the topic from 1994:

Fuzzy Logic: The Revolutionary Computer Technology That Is Changing Our World
http://www.amazon.com/dp/0671875353/?tag=absowrit-20

"Fuzzy logic" is a way to program computers so that they can mimic the imprecise way that humans make decisions. This important book traces the dramatic story of Lofti Zadeh, the Iranian-American professor who developed this concept, and his struggle to sell it to the American academic and business communities.

The advance here (presuming it IS one - I'm a bit skeptical - read that book subtitle again, and realize that most people still haven't heard of it 16 years later) is that for the first time "fuzzy logic" is built into the design of this chip.

This is not so much an advance in what computers can do - traditional (or should I say commercial? :D) computers can be programmed to do fuzzy logic. The advantage of building it into a chip is that it does fuzzy logic directly, and is thus faster than other chips of similar size programmed to do it.
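For example, the classic Zadeh fuzzy operators are just cheap arithmetic on truth values between 0 and 1 - any CPU can compute them in software. A minimal sketch:

```python
# Zadeh's classic fuzzy logic operators on truth values in [0, 1].
# Any general-purpose CPU can compute these; dedicated hardware just does it faster.

def f_and(a, b):
    return min(a, b)

def f_or(a, b):
    return max(a, b)

def f_not(a):
    return 1.0 - a

# "fairly tall AND somewhat heavy", with degrees 0.8 and 0.4:
print(f_and(0.8, 0.4))   # 0.4
print(f_or(0.8, 0.4))    # 0.8
print(f_not(0.8))        # ~0.2
```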

And for completeness, here's The Last Word On Any Topic, the Wikipedia page:
http://en.wikipedia.org/wiki/Fuzzy_logic
 

Maxinquaye

That cheeky buggerer
Super Member
Registered
Joined
Nov 10, 2009
Messages
10,361
Reaction score
1,032
Location
In your mind
Website
maxoneverything.wordpress.com
Heh, I didn't even know there was such a term. I just used some words in the title that I thought appropriate to the subject matter. Logic working on probabilities is going to be fuzzier than binary. :)
 

Zoombie

Dragon of the Multiverse
Super Member
Registered
Joined
Dec 24, 2006
Messages
40,775
Reaction score
5,947
Location
Some personalized demiplane
I have a vital and important question.

What effect, if any, will this new chip type - if it works on a mass market level - have on the video game market?
 

backslashbaby

~~~~*~~~~
Super Member
Registered
Joined
Feb 12, 2009
Messages
12,635
Reaction score
1,603
Location
NC
Zoombie said:
I have a vital and important question.

What effect, if any, will this new chip type - if it works on a mass market level - have on the video game market?

Faster. And better:

A big application for Lyric’s new technology will be error correction. 30 nm NAND flash memory will typically have 1 bit wrong per 1,000. As we reach to build smaller and smaller chips, that error rate is likely to increase. Lyric Error Correction (LEC) uses their probability processing to counter mistakes in memory processing. LEC gets the same results as traditional binary chips but in an area 30 times smaller, and with only 10% of the power.


The article suggests that the architecture of the chip is different:

While still built on silicon, Lyric’s probability chip uses a completely new architecture for gates. The chip doesn’t process a long series of opens and closed connections as ones and zeros. Instead, there’s a great connectivity between nodes, variables talk to each other, creating a highly parallel processing method. Instead of Boolean logic (And, Or, Not) the chip relies on Bayesian probability logic.

I haven't researched silicon-based stuff in ages, but advances in actual chip technology can be significant.

The theory behind the logic definitely isn't new at all, as others have said.
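For comparison, here's what "Bayesian probability logic" amounts to when done in ordinary software - a single belief update via Bayes' rule, with all the numbers invented for illustration:

```python
# One step of Bayesian updating, the kind of arithmetic the Lyric chip
# is claimed to do natively. All numbers are invented for illustration.

def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """P(hypothesis | evidence) from P(hypothesis) and the two likelihoods."""
    numerator = prior * likelihood_if_true
    return numerator / (numerator + (1 - prior) * likelihood_if_false)

# Belief that a sensor reading means "object present":
belief = 0.10                              # prior: 10%
belief = bayes_update(belief, 0.9, 0.2)    # strong positive reading
print(round(belief, 3))                    # 0.333
belief = bayes_update(belief, 0.9, 0.2)    # second positive reading
print(round(belief, 3))                    # 0.692
```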
 

SPMiller

Prodigiously Hanged
Super Member
Registered
Joined
Mar 30, 2008
Messages
11,525
Reaction score
1,988
Age
41
Location
Dallas
Website
seanpatrickmiller.com
I dipped into this at university. There are a few obvious applications on the level of bare metal. Otherwise, it still seems niche. Logicians and computer scientists (fuzzy distinction between them; see, e.g., the Curry-Howard Correspondence) figured this out decades ago.
 

Mara

Clever User Title
Super Member
Registered
Joined
Sep 21, 2009
Messages
1,961
Reaction score
343
Location
United States
I've never played White Wolf's Mage, but from what I've heard, the older edition of the game had mages using magical "trinary" computers with "yes, no, maybe" as possibilities - and the third option was what gave them magical powers. :)
 

SPMiller

Prodigiously Hanged
Super Member
Registered
Joined
Mar 30, 2008
Messages
11,525
Reaction score
1,988
Age
41
Location
Dallas
Website
seanpatrickmiller.com
Computer hardware has used ternary logic in the past. Decimal, even. There are certain advantages to binary logic in the engineering processes. Namely, the tolerances. You only have to worry about high/low or on/off. This is oooooold history.
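(The Soviet Setun machine, for instance, used balanced ternary: each "trit" is -1, 0, or +1, so negative numbers need no sign bit. A quick software sketch of the encoding:)

```python
# Balanced ternary, as used by the Soviet Setun computer:
# each trit is -1, 0, or +1, so negative numbers need no sign bit.

def to_balanced_ternary(n):
    """Return the trits of n, least significant first."""
    trits = []
    while n:
        r = n % 3
        if r == 2:          # represent 2 as (3 - 1): emit -1 and carry one
            trits.append(-1)
            n += 1
        else:
            trits.append(r)
        n //= 3
    return trits or [0]

def from_balanced_ternary(trits):
    return sum(t * 3**i for i, t in enumerate(trits))

print(to_balanced_ternary(8))                            # [-1, 0, 1], i.e. 9 - 1
print(from_balanced_ternary(to_balanced_ternary(-42)))   # -42
```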

Quantum computers, on the other hand, are much more interesting.
 

backslashbaby

~~~~*~~~~
Super Member
Registered
Joined
Feb 12, 2009
Messages
12,635
Reaction score
1,603
Location
NC
SPMiller said:
Computer hardware has used ternary logic in the past. Decimal, even. There are certain advantages to binary logic in the engineering processes. Namely, the tolerances. You only have to worry about high/low or on/off. This is oooooold history.

Quantum computers, on the other hand, are much more interesting.

Quantum hardware is what I found more fun to research kind of recently :)

For the non-binary chips, was that for desktops/mass market? I got the impression we were talking about common systems from the article, although I don't know why.
 

backslashbaby

~~~~*~~~~
Super Member
Registered
Joined
Feb 12, 2009
Messages
12,635
Reaction score
1,603
Location
NC
Now I'm really confused :) But yes, these are to be real, 'live' products within a few years:

http://www.theregister.co.uk/2010/08/17/lyric_probability_processor/

I'm not sure whether you, SP, mean that there has been a physical, realistic implementation of probability logic in hardware, or whether you're including conversions to binary, or whether you mean theoretical models. Help! :) A couple of examples would help, please.

Here's a snippet from the article I linked:

The probability processing that Lyric has invented doesn't do the on/off processing of a normal logic circuit, but rather makes transistors function more like tiny dimmer switches, letting electron flow rates represent the probability of something happening. When you want to reckon the probability of multiple possible events happening, you measure the electrons and that gives you the probability, which falls somewhere between 0 and 1. Digital processors have to figure out probabilities, and they do so today in great numbers.
Here's the difference. Reynolds says that a data center filled with servers that are calculating probabilities for, say, a financial model, will be able to consolidate from thousands of servers down to a single GP5 appliance to calculate probabilities. The reason is that the circuits that Lyric has invented - which have over 50 patents pending - are wickedly efficient at this. Digital logic that takes 500 transistors to do a probability multiply operation, for instance, can be done with just a few transistors on the Lyric chips. With an expected factor of 1,000 improvement over general purpose CPUs running probability algorithms, the energy savings of using GP5s instead of, say, x64 chips will be immense.
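The "probability multiply" they mention is conceptually trivial - it's just combining independent probabilities - but it's expensive to do with digital gates. In software (with values invented for illustration):

```python
# A "probability multiply" in ordinary software: combining two
# independent probabilities. Values are invented for illustration.
p_a = 0.9           # probability that event A happens
p_b = 0.7           # probability that event B happens
p_both = p_a * p_b  # probability both happen, assuming independence
print(p_both)       # ~0.63 -- one multiply, which digital logic spends ~500 transistors on
```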
 

benbradley

It's a doggy dog world
Super Member
Registered
Joined
Dec 5, 2006
Messages
20,322
Reaction score
3,513
Location
Transcending Canines
Zoombie said:
I have a vital and important question.

What effect, if any, will this new chip type - if it works on a mass market level - have on the video game market?

I second this question.
It means video games might be a lot harder to beat.

I suspect it will be many years before this technology shows up in video games. But when it does, video games may well drive the technology - games have been the main drivers for improving PC video technology for many years (those GameCube/Xbox/Nintendo things are mostly dedicated PCs with special operating systems for running games).

backslashbaby said:
I haven't researched silicon-based stuff in ages, but advances in actual chip technology can be significant.

backslashbaby said:
Quantum hardware is what I found more fun to research kind of recently :)

For the non-binary chips, was that for desktops/mass market? I got the impression we were talking about common systems from the article, although I don't know why.
Most things (such as all desktop computers) are straight binary. Calculators, digital clocks and watches, and microwave oven clocks/controllers use BCD, Binary Coded Decimal, in which four bits represent one decimal digit, 0 through 9. That allows showing the register contents directly on a display without extra circuitry for conversion to decimal.

But that's not much of a distinction - they're all considered just "binary" - a computer can convert between binary and BCD and whatnot easily enough.
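A quick software illustration of the BCD encoding:

```python
# Binary Coded Decimal: each decimal digit gets its own 4-bit group.

def to_bcd(n):
    """Encode a non-negative integer as a list of 4-bit binary strings."""
    return [format(int(d), "04b") for d in str(n)]

print(to_bcd(1905))        # ['0001', '1001', '0000', '0101']
print(format(1905, "b"))   # plain binary for comparison: 11101110001
```

Each 4-bit group maps straight to one digit of the display, which is why clocks and calculators bother with it.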

THIS "fuzzy logic" thing only has some narrow applications as far as I know. It may be a while before such a thing is used in consumer products.
 

backslashbaby

~~~~*~~~~
Super Member
Registered
Joined
Feb 12, 2009
Messages
12,635
Reaction score
1,603
Location
NC
This is above my pay grade, definitely :D

But let's see:

In the IT racket, pulling signals out of noise and doing memory error correction are areas where such probability processing will be immediately useful. As memories and flash get denser and smaller, or as I/O bandwidth goes up, error rates go up and the need to correct for errors grows beyond the ability of error correction software or firmware to keep up.

[. . .]

First up is the Lyric Error Correction chip, which is in its second generation today and ready for licensing. This LEC chip is fabbed by Taiwan Semiconductor Manufacturing Corp using a very cheap 180 nanometer process that was perfected a zillion years ago. Reynolds says this chip is perfect for doing error correction on flash memory. Using current 30 nanometer flash memory wafer baking tech, for every 1,000 bits you store, 1 bit comes back wrong when it is read and needs to be corrected.
Correcting for one error in 10,000 bits was hard enough, but 1 in 1,000 is a lot tougher. The digital ECC circuits used today can clean up errors so only one in every 1,000 trillion errors actually gets through. But, Reynolds says, with the next iteration of flash memory technology, the error rate will be 1 in 100 and error correction will have "a computational burden that is too high."
But the LEC chip that Lyric has created can be tiled to create fixed-function ECC for flash drives that can expand from 1 Gb/sec to 6 Gb/sec of bandwidth and yet be 30 to 70 times smaller than equivalent digital ECC circuits, use one-twelfth the power, and have four times the I/O bandwidth per pin between the flash memory and its controller. Adding such an ECC circuit will allow flash memory used in mobile devices and servers to have higher bandwidth and a longer field life - and have higher densities without sacrificing data.

Am I crazy for thinking this technology gives a ton more computational power in a teeny little package that takes little energy? Well, I give it a thumbs up :D Except for the need for so much new software, that is; or is that a good thing?
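Here's a back-of-envelope sketch of why rising raw error rates are such a problem - the chance of a block having more errors than a code can correct climbs fast. This is a generic binomial estimate, not Lyric's actual coding scheme:

```python
# Probability that a 1024-bit block suffers more than t bit errors,
# i.e. more than a t-error-correcting code can fix. Generic binomial
# model, purely illustrative -- not Lyric's actual coding scheme.
from math import comb

def p_uncorrectable(bits, p_err, t):
    """P(more than t of `bits` are flipped) at raw bit-error rate p_err."""
    p_at_most_t = sum(comb(bits, k) * p_err**k * (1 - p_err)**(bits - k)
                      for k in range(t + 1))
    return 1 - p_at_most_t

for p_err in (1/10_000, 1/1_000, 1/100):
    print(f"{p_err:.4f} -> {p_uncorrectable(1024, p_err, t=4):.3g}")
```

At 1 error in 100 bits you'd have to correct many more errors per block, which I take to be the "computational burden" Reynolds means.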
 

Lhun

New kid, be gentle!
Super Member
Registered
Joined
Jan 30, 2007
Messages
1,956
Reaction score
137
benbradley said:
I suspect it will be many years before this technology shows up in video games. But when it does, video games may well drive the technology - games have been the main drivers for improving PC video technology for many years (those GameCube/Xbox/Nintendo things are mostly dedicated PCs with special operating systems for running games).
From what I've read of this so far, I doubt it'll be of much use. Like quantum computing, fuzzy logic is interesting for very specific mathematical operations. Computer games, however, require straight-up number crunching, something that's done very well by normal chips and not done any better by quantum or fuzzy logic chips. While it might be possible to program video games to take advantage of a fuzzy logic chip, there's a huge hurdle to overcome: first you'd need to completely change the way the games are written, which only works in part anyway (e.g. rendering is always just number crunching), and then they'd have to actually run faster on fuzzy logic chips, which is not a given. Quantum computing is a huge advantage mostly for encryption and decryption, and pretty much useless for gaming.

backslashbaby said:
Am I crazy for thinking this technology gives a ton more computational power in a teeny little package that takes little energy?
Only for very specific tasks. The example in the article is an error-correcting chip for flash memory. That means more and cheaper flash memory, but not more and cheaper computing power. There are other uses, of course, but fuzzy logic chips won't replace current CPUs any time soon.
 