Discussion:
Intel partnering with Nvidia on Larrabee GPU / GPGPU ?
(too old to reply)
AirRaid
2007-06-04 22:09:29 UTC
Permalink
http://www.beyond3d.com/content/news/242


Larrabee: 16 Cores, 2GHz, 150W, and more...
Friday 01st June 2007, 06:08:45 PM, written by Arun


It is amazing how much information is out there in the wild, when you
know where to look. TG Daily has just published an article partially
based on a presentation they were tipped off about, and which was
uploaded on the 26th of April. It reveals a substantial amount of new
information, which we will not focus on analysing right now, so we do
encourage you to read it for yourself.

Page 1 discusses the possibility that Larrabee is a joint effort
between NVIDIA and Intel, which we find unlikely; it is possibly just
a misinterpretation of the recently announced patent licensing
agreement between the two companies. Page 2 is much more interesting,
however, as it links to the presentation above and also uncovers the
hidden Larrabee PCB diagram on slide 16.

We would tend not to agree with most of the analysis and speculation
provided by TG Daily, but it's still worth a good read along with the
presentation, which we are very glad they uncovered. Especially
interesting are slides 16, 17, 19, 24 and 31. That last one includes
some very interesting and previously unknown information on Intel's
upcoming Gesher CPU architecture (aka Sandy Bridge), which is aimed at
the 32nm node in the 2010 timeframe. Larrabee, on the other hand, will
presumably be manufactured on Intel's 45nm process but sport a larger
die size.


http://www.tgdaily.com/content/view/32282/137/


Intel set to announce graphics partnership with Nvidia?

By Wolfgang Gruener, Darren Polkowski
Friday, June 01, 2007 01:26


Chicago (IL) - Intel may soon be announcing a close relationship with
Nvidia, which apparently will be contributing to the company's
Larrabee project, TG Daily has learned. Larrabee is expected to roll
out in 2009 and debut as a floating point accelerator product with a
performance of more than 1 TFlops as well as a high-end graphics card
with dual-graphics capabilities.

Rumors about Intel's Larrabee processor have been floating around for
more than a year. Especially since the product's official announcement
at this year's spring IDF, and with accelerating interest in floating
point accelerators, the topic and the rumors surrounding it are
gaining traction every day.

Industry sources told TG Daily that Intel is preparing a "big"
announcement involving technologies that will be key to developing
Larrabee. And at least some of those technologies may actually be
coming from Nvidia, we hear: Our sources described Larrabee as a
"joint effort" between the two companies, which may expand over time.
A scenario in which Intel may work with Nvidia to develop Intel-
tailored discrete graphics solutions is speculation but is considered
to be a likely relationship between the two companies down the road.
Clearly, Intel and Nvidia are thinking well beyond their cross-
licensing agreements that are in place today.

It is unclear when the collaboration will be announced; however,
details could surface as early as June 26, when the International
Supercomputing Conference 2007 will open its doors in Dresden,
Germany.

Asked about a possible announcement with Intel, Nvidia spokesperson
Ken Brown provided us with a brief statement: "We enjoy a good working
relationship with Intel and have agreements and ongoing engineering
activities as a result. This said, we cannot comment further about
items that are covered by confidentiality agreements between Intel and
Nvidia."

Intel replied to our inquiry by saying that the company does "not
comment on rumors and speculation."



The AMD-ATI and Intel-Nvidia thingy

In the light of the AMD-ATI merger, it is only to be expected that the
relationship between Intel and Nvidia is examined on an ongoing basis.
So, what does a closer relationship between Intel and Nvidia mean?

The combination with ATI enabled AMD to grow into a different class of
company. It evolved from being CPU-focused into a platform company
that not only can match some key technologies of Intel, but at least
for now has an edge in areas such as visualization capabilities. At a
recent press briefing, the company showed off some of its ideas, and
it was clear to us that general purpose GPUs in particular will pave
the way to a whole new world of enterprise and desktop computing.

Nvidia is taking a similar approach with its CUDA software interface,
which allows developers to take advantage of the (general purpose)
floating point horsepower of GeForce 8 graphics processors - more than
500 GFlops per chip. Intel's Larrabee processor is also aimed at
applications that benefit from floating point acceleration - such as
physics, enhanced AI and ray tracing.
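
As a sanity check on that figure - back-of-the-envelope arithmetic
with publicly quoted G80 numbers (128 stream processors, a 1.35 GHz
shader clock, up to 3 flops per clock from the dual-issue MAD+MUL),
which are ours, not anything stated in this article:

#include <stdio.h>

int main(void)
{
    /* Public G80 specs, not from the article: 128 stream
       processors x 1.35 GHz x 3 flops/clock (MAD + MUL). */
    double sps = 128.0, ghz = 1.35, flops_per_clock = 3.0;
    printf("peak ~= %.0f GFlops\n", sps * ghz * flops_per_clock);
    return 0;
}

That lands at roughly 518 GFlops, consistent with the "more than 500
GFlops per chip" claim.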

While it has been speculated that Intel may be creating Larrabee with
an IA CPU architecture, we were told there may be more GPU elements in
this processor than we had previously thought. A Larrabee card with a
(general purpose) graphics processing unit will support CPUs in
applications that at least partially benefit from massively parallel
processing (as opposed to traditional sequential processing); in
gaming, the Larrabee processor could be used for physics processing,
for example.

An imminent collaboration announcement between Intel and Nvidia,
which reminds us of a recent Digitimes story claiming that Nvidia was
trading technologies with Intel, naturally raises the question of how
close the relationship between Intel and Nvidia might be. It also
raises the question, once again, of whether Intel may actually be
interested in buying Nvidia - which could make a whole lot of sense
for Intel, but appears to be rather unlikely at this time. Nvidia
could cost Intel more than
$15 billion, given the firm's current market cap of $12.6 billion, and
the talk in Silicon Valley indicates that Nvidia co-founder and CEO
Jen-Hsun Huang isn't really interested in selling the company.

But a deal with Intel, involving the licensing of technologies or
even the supply of GPUs, could have a huge impact on Nvidia's bottom
line and catapult the company into a new phase of growth. Then again,
a closer collaboration could be important for Intel as well: AMD's
acquisition of ATI was not a measure to raise the stakes in the
graphics market or to battle Nvidia; it was a move to compete in the
future CPU market - with Intel. Having Nvidia on board provides Intel
with a graphics advantage, at least from today's point of view, and
could allow the company to more easily access advanced graphics
technology down the road.



What we know about Larrabee

Intel has recently shared more information with the public about its
intents in the realm of general purpose GPUs (GPGPU). In a
presentation from March 7 of this year, Intel discussed its
data-parallel programming implementation called Ct. The presentation
discusses the use of flat vectors and very long instruction words
(VLIW, as utilized in ATI/AMD's R600). In essence, the Ct application
programming interface (API) bridges the gap by working with existing
legacy APIs and libraries and co-existing with current multiprocessing
APIs (Pthreads and OpenMP), yet provides "extended functionality to
address irregular algorithms."
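
Ct's actual syntax is not shown in the presentation, so the sketch
below is NOT Ct: it expresses the same flat-vector, element-wise idea
in plain C with OpenMP, one of the existing multiprocessing APIs Ct is
said to co-exist with.

#include <stdio.h>

#define N 1024

int main(void)
{
    static float a[N], b[N], c[N];
    int i;
    for (i = 0; i < N; i++) { a[i] = (float)i; b[i] = 2.0f * i; }
    /* A flat vector operation: every element is independent, so
       the runtime is free to spread iterations across cores. */
    #pragma omp parallel for
    for (i = 0; i < N; i++)
        c[i] = a[i] + b[i];
    printf("c[42] = %.1f\n", c[42]);
    return 0;
}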

[Slide image: block diagram of a board utilizing Larrabee]

There are several things to point out from the image above, which is a
block diagram of a board utilizing Larrabee. First is the PCIe 2.0
interface with the system. Intel is currently testing PCIe 2.0 as part
of the Bearlake-X (Beachwood) chipset (commercial name: X38), which
could be coming out as part of the Wolfdale 45 nm processor rollout
late this year or early in 2008. Larrabee won't arrive until 2009, but
our sources indicate that if you buy an X38-based board, you will be
able to run a Larrabee board in such a system.
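
For reference, the bandwidth PCIe 2.0 brings is easy to work out from
the standard's own numbers (ours, not the article's): 5 GT/s per lane
with 8b/10b encoding.

#include <stdio.h>

int main(void)
{
    /* Standard PCIe 2.0 numbers: 5 GT/s per lane, 8b/10b encoding,
       16 lanes in an x16 slot. */
    double lanes = 16.0, gt = 5e9, encoding = 8.0 / 10.0;
    double gbytes = lanes * gt * encoding / 8.0 / 1e9; /* bits->GB */
    printf("x16 PCIe 2.0: %.0f GB/s per direction\n", gbytes);
    return 0;
}

That is 8 GB/s each way, double what PCIe 1.x offers an x16 card.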

In the upper right hand corner the power connections indicate 150
watts and 75 watts. These correspond to 8-pin and 6-pin power
connections that we have seen on the recent ATI HD2900XT. Intel
expects the power consumption of such a board to be higher than 150
watts. There are video outputs to the far left, as well as video in.
Larrabee appears to have VIVO functionality as well as HDMI output,
based on the audio-in block seen at the top left.
A set of BSI connections is next to the audio-in connection. We are
not positive what the abbreviation stands for, but we speculate that
these are connections for using these cards in parallel, like ATI's
CrossFire or Nvidia's SLI technologies. Finally, there is the size of
the processor (package). That is over twice the size of current GPUs,
as ATI's R600 is roughly 21 mm by 20 mm (420 mm²). Intel describes the
chip as a "discrete high end GPU" on a general purpose platform, using
at least 16 cores and providing a "fully programmable performance of 1
TFlops."

[Slide image: Larrabee multi-SIMD core configuration and ring bus]


Moving on, we can see that Larrabee will be based on a multi-SIMD
configuration. From other discussions about the chip across the net,
it would seem that each core is a scalar design that works using
Vec16 instructions. That would mean that, for graphics applications,
it could work on blocks of 2x2 pixels at a time (four pixels times
four colour components gives the 16 lanes). These in-order execution
SIMDs will have floating point 16 (FP16) precision as outlined by
IEEE 754. Also of note is the use of a ring memory architecture. In a
presentation by Intel Chief Architect Ed Davis called "tera Tera
Tera", Davis outlines that the internal bandwidth on the bus will be
256 B/cycle and that the external memory will have a bandwidth of 128
GB/s. This is extremely fast, and achievable based on the 1.7-2.5 GHz
projections for the core frequency. Attached to each core will be
some form of texturing unit as well as a dynamically partitioned
cache and a ring stop on the memory ring.
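
A quick check on those figures (the 2x2-quad reading of Vec16 and the
clock range come from the sources above; turning them into bandwidth
numbers is our own arithmetic):

#include <stdio.h>

int main(void)
{
    /* Assumptions flagged: Vec16 read as a 2x2 pixel quad times
       four colour components (RGBA); the 256 B/cycle ring figure
       taken at the projected 1.7-2.5 GHz core clock. */
    int lanes = 2 * 2 * 4;
    double lo = 256.0 * 1.7e9 / 1e9;  /* GB/s at 1.7 GHz */
    double hi = 256.0 * 2.5e9 / 1e9;  /* GB/s at 2.5 GHz */
    printf("%d SIMD lanes; ring carries %.0f-%.0f GB/s\n",
           lanes, lo, hi);
    return 0;
}

At either end of the clock range the ring comfortably exceeds the
quoted 128 GB/s external memory bandwidth, which is what you would
expect of an on-chip interconnect.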

In the final image below you will notice that each device will have
17 GB/s of bandwidth per link. These links tie into a next-generation
southbridge titled "ICH-n", as the exact version is yet to be
determined. From discussions with those in the industry, it would
appear that the external memory might not be soldered onto the board
but might in fact be plug-in modules. The slide denotes DDR3, GDDR,
as well as FBD or fully buffered DIMMs. It will be interesting to see
what form this will actually take, but that is the fun of
speculation.

[Slide image: Larrabee memory links and next-generation southbridge]


The current layout of project Larrabee is a deviation from previous
Intel roadmap targets. In a 2005 whitepaper entitled "Platform 2015:
Intel Processor and Platform Evolution for the Next Decade", the
company outlines a series of XScale processors based on Explicitly
Parallel Instruction Computing, or EPIC. Intel has deviated slightly
from its initial roadmap since the release of this paper: Intel sold
XScale to Marvell last year, which makes it a rather unlikely basis
for Larrabee - and could have opened up the discussion for other
processing units.

What is interesting is that rumors that Intel was looking for talent
for an upcoming "project" involving graphics began circulating more
than a year and a half ago. In August of last year, you could apply
for positions on CareerBuilder and on Intel's own website. A generic
job description is still posted on Intel's website.



Concluding note

While this is an interesting approach to graphics, physics, and
general purpose processing, the real test will be the final product
as well as its acceptance among independent software vendors (ISVs).
In our opinion, the concept of the GPGPU is the most
significant development in the computer environment in at least 15
years. The topic has been gaining ground lately, and this new
implementation from Intel could take things to a whole new level. As
for graphics performance, only time will tell.

It will be interesting to see what role Nvidia will play in Intel's
strategy. Keep a close eye on this one.
Klaus Fehrle
2007-06-05 00:08:15 UTC
Permalink
AirRaid wrote:

<snip>
Post by AirRaid
In our opinion, the concept of the GPGPU is the most
significant development in the computer environment in at least 15
years.
From a limited historical scope of only a quarter-century on
comp.arch (which makes me a rookie here), GPGPU looks basically very
similar to the coprocessors we saw a longer while ago. It will need an
ISA extension (similar to x87) and a couple of years to exploit it. It
will undoubtedly offer opportunities to enhance the capabilities of
the architecture significantly - but not to an extent that qualifies
as the most significant development within 15 years per se. However, I
believe there is indeed something in this context that would qualify
for such a statement - actually not only limited to 15 years, but for
the entire history of x86: the concept of an open architecture, aka
Torrenza.

<snip>

Regards
Klaus

Follow-ups set to comp.arch
fungus
2007-06-05 02:42:14 UTC
Permalink
Post by AirRaid
In our opinion, the concept of the GPGPU is the most
significant development in the computer environment in at least 15
years.
Rubbish, it's just another "coprocessor" - hardly revolutionary.
As long as I can remember there have been number crunching
add-in cards for PCs.

The only exciting aspect of the new GPGPUs is that they're
part of a graphics card which is a fairly easy sell to a normal
person. The computing power of a dedicated maths board
will become much more widely available than before.

The problems I see with the "GPGPU" concept are:

a) The two architectures (i.e. "GP" and "GPU") have conflicting
requirements, so compromises have to be made.

The priority is currently "GPU", so the "GP" side is suffering. E.g.
you only have "texture" as an input data format and the only output
is "framebuffer" - nice if your data happens to fit that model, but
hardly general purpose computing. Even Havok, the darling application
for GPUs, can only be partially accelerated because of these
limitations.

b) Graphics cards only have/need single-precision math and this
simply isn't good enough for many scientific applications.
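
To put a number on (b) - a minimal C illustration of my own, using
the usual 32-bit float / 64-bit double: float carries only about 7
significant decimal digits, so a unit increment to a value around
10^8 simply disappears.

#include <stdio.h>

int main(void)
{
    float  f = 1e8f;   /* ~7 significant decimal digits  */
    double d = 1e8;    /* ~16 significant decimal digits */
    f += 1.0f;
    d += 1.0;
    printf("float : %.1f\n", f);  /* 100000000.0 - the +1 is lost */
    printf("double: %.1f\n", d);  /* 100000001.0 - kept           */
    return 0;
}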

c) For gamers (the people most likely to have a GPGPU) there's a
bigger problem. When you're doing GP, the GPU is blocked and vice
versa. Every computation done in GP mode is one less polygon visible
on screen. Multi-core CPUs will mostly be sitting idle waiting around
for the graphics card to do its thing instead of doing something
useful.

I was much more interested in AGEIA's PhysX processor when it
appeared. Their add-in cards seem to be much more general
purpose, have real potential to crunch some numbers and they
do it in parallel with the graphics card! This is an ideal situation
for a multi-core machine.


d) Power consumption. The new graphics cards suck power like
there's no tomorrow. For a fully loaded PC you now need a 1 kW
power supply. PCs already consume a decent chunk of the world's
power and I'm not sure that loading everybody's machine up with
one of these graphics cards just to run Aero Graphics is a good
idea.


--
<\___/>
/ O O \
\_____/ FTB. Remove my socks for email address.


Governments, like diapers, should be changed often,
and for the same reason.
AirRaid
2007-06-07 20:30:49 UTC
Permalink
another interesting, though speculative Larrabee article:
___________________
Intel's Larrabee: A killer blow for AMD

Could Larrabee mean another tortuous time for AMD?

tech.co.uk staff
Thursday 07 June 2007 16:19

It's a silly sounding name, Larrabee. But it must fill AMD's heart
with terror. It's the codename, of course, for a whole family of new
processors being cooked up by Intel. And it promises to add graphics
insult to AMD's existing CPU injuries.

Frankly, things are bad enough for AMD already. Since launch last
summer, the Core 2 processor has been pistol whipping AMD's Athlon
CPUs into burger meat. Meanwhile, AMD's upcoming quad-core competitor,
broadly known as Barcelona, looks like a pretty unambitious effort. It
will certainly have to be some chip to take on Intel's upcoming 45nm
die shrink of the Core 2 chip. Factor in recent reports of a launch
delay for Barcelona and I'm beginning to get the fear about AMD's
ability to compete.

Then there's the spectacular fashion in which the wheels have come off
AMD's recently acquired ATI graphics subsidiary. ATI's all-new
flagship DX10 graphics board, the Radeon HD 2900 XT, was very late,
extremely underwhelming on arrival and possibly a bit broken. The
midrange variants of the Radeon HD range don't look much healthier:
they've been sent back to the fab for a respin. Not a good sign.

In that context, the emergence of the Larrabee project from Intel is
just further proof of how far ahead of the game Intel appears to be
at the moment. For the uninitiated, Larrabee is an all-new multi-core
processor design that majors on floating point power.
The full feature set hasn't been revealed as yet, but an official
Intel document turned up on a university website recently that reveals
several fascinating new details.

Try these specs for size. Larrabee will be available in
configurations ranging from 16 to 24 cores, with clock speeds as high
as 2GHz and raw performance in the 1TFlop range. The latter figure is
approximately 40
times more than an existing Intel Core 2 Duo chip. Yup, you read it
right. 40 times. And the first Larrabee chips are pencilled in for as
soon as 2009.
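
That 40x figure checks out roughly; the Core 2 numbers in this sketch
are mine, not the article's:

#include <stdio.h>

int main(void)
{
    /* Assumed: a Core 2 Duo does roughly 2 cores x 4
       single-precision flops/clock (SSE) x ~3 GHz = ~24 GFlops. */
    double core2duo_gflops = 2.0 * 4.0 * 3.0;
    printf("1000 / %.0f ~= %.0fx\n", core2duo_gflops,
           1000.0 / core2duo_gflops);
    return 0;
}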

Of course, floating point power is just one part of the overall PC
processing equation - Intel will be retaining a conventional CPU
roadmap for general purpose duties based on the existing Core 2
family.

But Larrabee will take Intel into brand new markets. Significantly,
the document confirmed that a variant with full 3D video rendering
capability is on the cards. As we reported earlier this week, the
rumblings on the rumour mill suggest the chip could be a joint effort
with Nvidia.

Either way, the most fascinating aspect of the Larrabee GPU is the
expectation that it could be the first graphics processor to combine
both traditional raster graphics with more advanced ray-tracing
techniques.

Without getting bogged down in the details, suffice it to say that
raster graphics are a bit of a kludge when it comes to simulating
lighting. Ray-tracing is the real deal. Ask any 3D graphics
professional what they think about ray tracing on GPUs and they'll
tell you it's a matter of when rather than if.

Of course, AMD and ATI will know perfectly well that ray tracing is
the future. But what must be really worrying is that it presents Intel
with the perfect inflection point to enter the graphics market. ATI
and Nvidia have refined raster graphics to the point where other
companies, including Intel, simply can't compete. But a new age of ray-
traced graphics will level the playing field and might just hand Intel
a chance for the total domination of the PC platform it so dearly
desires.

Jeremy Laird
________________

http://tinyurl.com/2znr39
Marra
2007-06-07 23:05:25 UTC
Permalink
Post by AirRaid
<snip quoted article>
Having been a user of PCs since the early 80s, I have always found
myself disappointed with the power of the PC and its graphics.
Just as the speed of graphics gets almost acceptable, we move on from
DOS to Windows or move up to a higher resolution screen with more
colours!

I now have a PC around 1000 times faster than my first PC and it
struggles to cope with .NET 3.0 graphics!

I have developed CAD software since around 1990 and still spend a lot
of time optimising my code to speed up the graphics.
AirRaid
2007-06-08 16:47:18 UTC
Permalink
Post by Marra
Post by AirRaid
<snip quoted article>
Having been a user of PCs since the early 80s, I have always found
myself disappointed with the power of the PC and its graphics.
Just as the speed of graphics gets almost acceptable, we move on from
DOS to Windows or move up to a higher resolution screen with more
colours!
I now have a PC around 1000 times faster than my first PC and it
struggles to cope with .NET 3.0 graphics!
I have developed CAD software since around 1990 and still spend a lot
of time optimising my code to speed up the graphics.
In the 1980s and early 1990s, computers such as the Commodore Amiga,
Sharp X68000 and Fujitsu FM-Towns had MUCH better graphics
capabilities than the fastest PCs (or Macs). Those computers all had
custom chips that went beyond what any PC could do.

I long for a new generation of computers (non-IBM PCs) that have the
same edge in graphics over the current fastest PCs as was the case
15-20+ years ago. But all of the best engineers are working on
PC-compatible processors, graphics processors or game consoles.
Didi
2007-06-08 17:48:02 UTC
Permalink
Post by AirRaid
in the 1980s and early 1990s, computers such as the Commodore Amiga,
Sharp X68000 and Fujitsu FM-Towns had MUCH better graphics
capabilities than the fastest PCs (or Macs). Those computers all
had custom chips that went beyond what any PC could do.
I long for a new generation of computers (non IBM PCs) that have the
same edge in graphics over the current fastest PCs as was the case
15-20+ years ago. but all of the best engineers are working on PC-
compatible processors, graphics processors or game consoles.
Well, I guess graphics cards have reached the point beyond which
further improvement has little to offer human vision. Besides, there
still are peripheral cards; the issue is not that there is no
hardware to buy, rather that the data on how to program it are (I
think) kept secret.
OK, then just try to buy a single-chip graphics controller, and you
are out of luck (the last one on the market used to be the b69030 by
Chips/Intel/Asiliant - killed a few years ago). ...:-)

And the real real issue is that there is no PPC-based computer on the
market within some reasonable price range; everything is x86-clogged.
The PS3 and Xbox do not count, those are computerised TV-sets, and
their programming data are kept secret.

To add to your point that graphics outside the PC world were way more
innovative during the 80s - and to my point that today there is not
much left to innovate about graphics - let me mention a recent
experience.

In the 80s, I had designed a graphics/alphanumeric terminal: a proud
4 bits per pixel of colour at 640x408 (right, not 480 - it had 4x
62256 RAM chips as the graphics buffer...). Perhaps its most
innovative/unique feature was that one of the 4 colour attribute bits
for the background of each character was used as a graphics/character
switch for that character position, thus allowing easy switching
between the graphics and character layers on a per-character basis -
very convenient (then?).
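
In today's terms that per-cell switch is a one-line multiplexer; a
minimal C sketch (the names and the exact bit position are
illustrative guesses, not the original design):

#include <stdint.h>
#include <stdio.h>

/* One of the four background-attribute bits selects graphics or
   character data for that one character cell. */
#define GRAPHICS_BIT 0x08  /* assumed position within the nibble */

static uint8_t pick_layer(uint8_t bg_attr, uint8_t gfx_byte,
                          uint8_t chr_byte)
{
    return (bg_attr & GRAPHICS_BIT) ? gfx_byte : chr_byte;
}

int main(void)
{
    printf("%02x %02x\n", pick_layer(0x08, 0xAA, 0x55),
                          pick_layer(0x00, 0xAA, 0x55));
    return 0;
}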

Recently I had to do an emulation of that old system (I had lots of
hardware & PCB designs done in a graphics editor I had written using
this terminal). I emulated the entire system, which consisted of a
6809-based "computer" and the terminal - which was also 6809-based,
although this is irrelevant here.
I emulated the system as a DPS task, and it cost me a week or so to
emulate the entire terminal in a DPS window (actually two of them,
one holding the graphics page and another multiplexed between
characters and graphics, the end result)... All this done on a small
PPC system, which uses off-screen buffers for each window, running
tens of times faster than the original.

Dimiter

------------------------------------------------------
Dimiter Popoff Transgalactic Instruments

http://www.tgi-sci.com
------------------------------------------------------
Miles Bader
2007-06-08 23:15:09 UTC
Permalink
Post by Didi
and to my point that today there is not much left to innovate about
graphics
Er, I agree that today's rather homogenized industry is kind of
depressing in many ways, but the above statement is just bizarre.
There's insane amounts of innovation happening in graphics these days...

-miles
--
"1971 pickup truck; will trade for guns"
Didi
2007-06-09 00:47:28 UTC
Permalink
Post by Miles Bader
There's insane amounts of innovation happening in graphics these days...
In graphics computation, yes. But this has little if anything to
do with graphics controllers (things which move framebuffer memory
to the display in a loop), although they are often on the same
chip.

Dimiter

------------------------------------------------------
Dimiter Popoff Transgalactic Instruments

http://www.tgi-sci.com
------------------------------------------------------
legalize+ (Richard)
2007-06-09 00:12:38 UTC
Permalink
[Please do not mail me a copy of your followup]
Post by Didi
And the real real issue is that there is no PPC-based computer on the
market within some reasonable price range; everything is x86-clogged.
The PS3 and Xbox do not count, those are computerised TV-sets, and
their programming data are kept secret.
What a bunch of crap! PS3 has the GNU gcc/etc. toolchain for it and
Xbox 360 has XNA Game Studio Express.
Post by Didi
To add to your point that graphics outside the PC world were way
more innovative during the 80-s - and to my point that today there
is not much left to innovate about graphics [...]
This confirms that you are definitely suffering from recto-cranial
inversion.
--
"The Direct3D Graphics Pipeline" -- DirectX 9 draft available for download
<http://www.xmission.com/~legalize/book/download/index.html>

Legalize Adulthood! <http://blogs.xmission.com/legalize/>
Didi
2007-06-09 01:02:25 UTC
Permalink
Usually I do not reply to posts which are both rude and stupid but
I am in a good mood now so here it goes.
Post by legalize+ (Richard)
What a bunch of crap! PS3 has the GNU gcc/etc. toolchain for it and
Xbox 360 has XNA Game Studio Express.
I know you can run Unix shell etc. stuff on them. But you have a way
to go if you think this is all it takes to program it... (although
HTML writers nowadays do call themselves programmers, actually?).
Have a look at the sources. They only go up to hypervisor access; in
hypervisor mode the PS3 OS takes over and does the hardware access
for you. Try getting info on that Toshiba peripheral chip - good luck
with that.

Like I said, yet another computerised TV-set. Good enough for public
consumption - how many can tell the difference between programming
and using the remote control?
Can you?

Dimiter

------------------------------------------------------
Dimiter Popoff Transgalactic Instruments

http://www.tgi-sci.com
------------------------------------------------------
Dennis
2007-06-10 02:00:00 UTC
Permalink
Post by Didi
Like I said, yet another computerised TV-set.
Good enough for public consumtion, how many can tell the
difference between programming and using the remote control.
Can you?
Dimiter
Yes
http://www.csc.ncsu.edu/news/news_item.php?id=464
Keith S.
2007-06-08 19:22:45 UTC
Permalink
Post by AirRaid
It's a silly sounding name, Larrabee.
Indeed it is. Rhymes with 'wannabee'.
AirRaid
2007-06-18 18:05:39 UTC
Permalink
http://community.zdnet.co.uk/blog/0,1000000567,10005497o-2000331777b,00.htm

Intel Larrabee roadmap -- who says there ain't no Santa Cores?
Posted by Rupert Goodwins

More news is leaking out about Larrabee, Intel's many-core x86
project. According to what Google translates as Hiroshige Goto's
Weekly from Japan, there'll be 24- and 32-core variants out in 2009
and a 48-core chip in 2010. The 24-core variant may even be the
32-core version in disguise, as a way to ship useful parts when one
or more cores don't work.
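
The economics of that trick are easy to sketch; the per-core yield
figure below is an assumption of mine, purely for illustration:

#include <stdio.h>
#include <math.h>

/* If each of 32 cores works independently with probability p, a die
   can ship as a 24-core part whenever at least 24 cores are good -
   the binomial tail computed below. */
static double choose(int n, int k)
{
    double r = 1.0;
    int i;
    for (i = 1; i <= k; i++)
        r = r * (n - k + i) / i;
    return r;
}

int main(void)
{
    double p = 0.9, tail = 0.0;
    int k;
    for (k = 24; k <= 32; k++)
        tail += choose(32, k) * pow(p, k) * pow(1.0 - p, 32 - k);
    printf("P(>=24 of 32 cores good) = %.4f\n", tail);
    return 0;
}

Nearly every die with a handful of dead cores can then still be sold,
which is presumably the point.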

Picking my way carefully through the Googleised Japanese, it appears
that the first product Larrabee may appear in is a PCI Express 2
accelerator card - mostly for graphics, but with plenty of other
options for tasks that like lots of high-speed floating point. That's
where most of the x86 instruction set enhancements will come in, too,
together with specialised parallel control instructions. That makes
for interesting comparisons with IBM's Cell, which has a conventional
PowerPC core doing control and housekeeping and entirely incompatible
processor units managing the heavy lifting.

Oh, and please not to be confusing the Larrabee with the Polaris,
Intel's other public many-core chip. Polaris is not x86, it's not
going to be a product, it's a testbed and, aside from having lots of
cores (80, as opposed to Larrabee's 24-48) there's not much
similarity. Polaris uses a cross-switch matrix for core
interconnection, Larrabee a 256-byte-per-cycle ring; Polaris has
stacked memory, Larrabee multiple on-chip DRAM controllers (as far as
I can tell)...
