
OpenCL spec is released


digitallysane

Recommended Posts

I read an article somewhere saying that Nvidia was definitely going to support it on their next generation of cards, and that ATI probably would too :).

I just wish I could remember where the article was for proof.

M


Yep, Nvidia and AMD/ATI will support OpenCL.

No official news from Intel that I can find, but they'd be a little crazy not to support it with Larrabee (especially since the next architectural enhancement to the i7 will include Larrabee's AVX (16-wide vector) instruction set). Lots of rumours out there, though.

As long as an OpenCL implementation has default CPU support, I think its adoption will be more widespread. Having to write two versions of your algorithm, one for the GPU and one for the CPU (in case the user doesn't have a supported GPU), was a main concern of many of the developers I talked to at Nvision. Forever after, you have to keep them in sync, which is a developer's nightmare as the number of GPU algorithms in your code increases.

On the other hand, Microsoft isn't going to support it, presumably because they will support compute shaders in DX11. However, this would only be a problem if you had a pre-8x00 Nvidia card, a pre-X1x00 ATI/AMD card, one of those integrated graphics disasters from Intel, or an older driver that doesn't include OpenCL within the driver package.
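The single-source point above is the key selling pitch: with CPU support in the runtime, one kernel can target whichever device is present. A minimal host-side sketch of that fallback pattern (assuming an installed OpenCL SDK; function names are the standard OpenCL 1.0 API, error handling abbreviated):

```c
/* Sketch of the "one kernel, CPU fallback" idea: prefer a GPU device,
   fall back to the CPU device if none is available. */
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/cl.h>
#else
#include <CL/cl.h>
#endif

int main(void) {
    cl_platform_id platform;
    cl_device_id device;
    cl_int err;

    clGetPlatformIDs(1, &platform, NULL);

    /* Ask for a GPU first; if that fails, take the CPU device. */
    err = clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);
    if (err != CL_SUCCESS)
        err = clGetDeviceIDs(platform, CL_DEVICE_TYPE_CPU, 1, &device, NULL);
    if (err != CL_SUCCESS) {
        fprintf(stderr, "No OpenCL device found\n");
        return 1;
    }

    char name[256];
    clGetDeviceInfo(device, CL_DEVICE_NAME, sizeof(name), name, NULL);
    printf("Running on: %s\n", name);

    /* The same kernel source would now be built with clBuildProgram for
       whichever device was selected -- no second CPU-only code path. */
    return 0;
}
```

The rest of the program (context, queue, kernel build) is identical either way, which is exactly why a default CPU implementation removes the two-codepath maintenance problem.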


As long as an OpenCL implementation has default CPU support, I think its adoption will be more widespread. Having to write two versions of your algorithm, one for the GPU and one for the CPU (in case the user doesn't have a supported GPU), was a main concern of many of the developers I talked to at Nvision. Forever after, you have to keep them in sync, which is a developer's nightmare as the number of GPU algorithms in your code increases.

yeah, that would suck.

On the other hand, Microsoft isn't going to support it, presumably because they will support compute shaders in DX11.

Which is to be expected really :). Idiots.

M

