21 Replies. Latest reply: Apr 7, 2013 9:43 PM by hxmingzzz

Texture reads return garbage for CLImage2D [with minimal example]

thegregg Newbie

Hello everyone,

 

First, I should say that this is a cross-post from http://www.khronos.org/message_boards/viewtopic.php?f=56&t=5456

Since I didn't get an answer there after two weeks, I figured I would post here as well.

 

Basically, I'm having trouble with texture reads (read_imagef) when the texture size is 960x540 pixels and the format is CL_RGBA with CL_UNORM_INT8 or CL_UNSIGNED_INT8. I get correct results for any other texture size. Also note that the problem only appears on ATI cards on Windows; every other configuration works (e.g. ATI on Linux, NVIDIA on Windows/Linux).

 

I made a small example that reproduces the problem on all ATI cards I could test with: FirePro V7900, HD6900, HD5870. You can get the source here: http://pastebin.com/ZAHMHu5B

 

Can anyone reproduce this problem?

Could this be a driver bug, and if so, who should I contact?

 

Let me know if you need more specific information.

 

Kind regards,

Gregg
