
Thread: OpenGL 3.2 Specification Officially Released

  1. #21
    Join Date
    Aug 2007
    Posts
    6,610

    Default

    Why is it a mistake to get the most out of existing hardware? ATI is free to add the same extensions.

  2. #22
    Join Date
    Jan 2008
    Posts
    772

    Default

    Quote Originally Posted by Qaridarium View Post
    Wine only made one mistake... the Wine devs thought that the extensions were official OpenGL ...
    Wouldn't it be pretty hard to make that mistake, considering that vendor-specific extensions have their own prefixes?

  3. #23
    Join Date
    Jul 2009
    Posts
    351

    Default

    Is there any work towards "sanitizing" OpenGL? It is very hard to expose OpenGL properly in languages like Java because you can pass arguments to OpenGL functions that will crash your program. This sort of thing does not fly in Java. If OpenGL or its wrapper were to do proper bounds checking on all of its arguments, it would slow everything down. I wonder if there is anything that can be done.
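    To illustrate, here is a rough C sketch of the kind of check a wrapper would have to make on every draw call. The vertex-count bookkeeping (tracked_vertex_count and the checked_* wrappers) is made up for illustration and is not taken from any real binding:

    Code:
    /* Hypothetical validating shim a binding layer could put in front of
     * glDrawArrays.  The size bookkeeping is invented for this example. */
    #include <GL/gl.h>
    #include <stdio.h>

    /* The binding remembers how many vertices the current vertex array
     * actually holds when the application sets it up. */
    static GLsizei tracked_vertex_count = 0;

    void tracked_glVertexPointer(GLint size, GLenum type, GLsizei stride,
                                 const void *data, GLsizei vertex_count)
    {
        tracked_vertex_count = vertex_count;       /* remember the real extent */
        glVertexPointer(size, type, stride, data); /* forward to the driver    */
    }

    /* Reject out-of-range draws instead of letting the driver read past the
     * end of the array and crash the process. */
    int checked_glDrawArrays(GLenum mode, GLint first, GLsizei count)
    {
        if (first < 0 || count < 0 || first + count > tracked_vertex_count) {
            fprintf(stderr, "draw rejected: range %d..%d exceeds %d vertices\n",
                    (int)first, (int)(first + count), (int)tracked_vertex_count);
            return 0; /* a Java binding would throw an exception here instead */
        }
        glDrawArrays(mode, first, count);
        return 1;
    }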

  4. #24
    Join Date
    Jul 2008
    Location
    Greece
    Posts
    3,788

    Default

    OpenGL does not have to follow Java's brain-damaged philosophies.

  5. #25
    Join Date
    Jul 2009
    Posts
    351

    Default

    Quote Originally Posted by RealNC View Post
    OpenGL does not have to follow Java's brain-damaged philosophies.
    Why is it brain-dead to throw an exception instead of crashing? You can recover from an exception.

    If OpenGL were "sanitary" then it could be used in sandboxes for web applets and such. In its current state you have to wrap it up or else it is just a big security hole.

  6. #26
    Join Date
    Jul 2009
    Posts
    351

    Default

    It's not even just Java. It is the same situation for Ruby, Perl, Scheme, Python, or any other interpreted language.

  7. #27
    Join Date
    Jul 2009
    Posts
    351

    Default

    Have you ever profiled a 3-D app? Even with a fancy high-end card, your program spends the vast majority of its cycles inside of OpenGL calls. This means that you can write your app in a slow interpreted language and it will really not slow things down much at all.

    3-D apps are all about looks, and looks are very subjective, so you do a lot of fussing with the properties of objects to get them to look "right". This means lots of recompiling if you write in C or C++. If you work in an interpreted language you can quickly tweak the look and the action until they are the way you want, and THEN you can port it to C or C++ if you really want the speed.

    "When writing a program, you should plan to throw the first version away, because you inevitably will, whether you planned to or not" - Gerald Sussman, inventor of Scheme and otherwise really smart guy.

    If one follows his advice then one should always prototype in a nice, easy-to-work-with language. There are enough headaches in developing new code; why give yourself the extra burden of worrying about pointers and memory allocation when you should be focused on how your game is going to play or how your visualization is going to show the tumor cells?

    The problem with OpenGL is that even your prototype app will dump core if you mess up, and you will still find yourself in gdb looking at stack traces even though you made an effort to keep your head out of the bits and bytes.
    Last edited by frantaylor; 08-04-2009 at 11:58 PM.

  8. #28
    Join Date
    Oct 2007
    Posts
    912

    Default

    OpenGL is very low-level for performance reasons - it doesn't lend itself to being used directly from a higher-level language, and that's why wrappers / bindings exist for it. OpenGL does return error values, however, and these can be checked easily enough. If the error values are not properly reported, it's not the fault of OpenGL, but rather of the wrapper implementation.
    Interpreted languages also do an awful lot of error checking internally, so wrappers could do that quite easily as well.
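    For example, here is a minimal C sketch of checking those error values by draining the error queue after a call. Only glGetError() itself is core OpenGL; the helper name is made up for this example:

    Code:
    #include <GL/gl.h>
    #include <stdio.h>

    /* Drain OpenGL's error queue and report everything found; more than one
     * error can be pending, so loop until GL_NO_ERROR comes back. */
    static void check_gl_errors(const char *where)
    {
        GLenum err;
        while ((err = glGetError()) != GL_NO_ERROR) {
            fprintf(stderr, "GL error 0x%04x after %s\n", err, where);
            /* a binding for an interpreted language could raise an
             * exception here instead of just printing */
        }
    }

    /* Usage:
     *     glTexImage2D(...);
     *     check_gl_errors("glTexImage2D");
     */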

  9. #29
    Join Date
    Aug 2008
    Location
    Finland
    Posts
    1,578

    Default

    Quote Originally Posted by Kano View Post
    Why is it a mistake to get the most out of existing hardware? ATI is free to add the same extensions.
    It's not, as long as you're mentally prepared to do things the Right Way (tm) and write distinct code paths for all vendors - read: Intel, nVidia, ATi, etc. - like Microsoft's DirectX afaik does. Wine's devs seem to adamantly believe they can get away with one uniform solution, which is obviously wrong; from what I've heard, they think it should be enough to write just the nVidia codepath and that it is the driver vendors' responsibility to adapt their drivers to it. This is very likely to give you reduced performance on non-nVidia hardware even if the driver implementation is just as good as nVidia's. Of course, who wants to do three times - or five, if we count the open ATi and nVidia drivers - as much work?
    Last edited by nanonyme; 08-05-2009 at 07:54 AM.
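    A rough C sketch of what picking a per-vendor codepath could look like. The substrings matched here ("NVIDIA", "ATI", "AMD", "Intel") are assumptions about what the respective drivers report, not guaranteed values:

    Code:
    #include <GL/gl.h>
    #include <string.h>

    enum gpu_vendor { VENDOR_NVIDIA, VENDOR_ATI, VENDOR_INTEL, VENDOR_OTHER };

    /* Requires a current GL context; glGetString(GL_VENDOR) is core OpenGL. */
    enum gpu_vendor detect_vendor(void)
    {
        const char *vendor = (const char *)glGetString(GL_VENDOR);
        if (vendor == NULL)
            return VENDOR_OTHER;
        if (strstr(vendor, "NVIDIA"))
            return VENDOR_NVIDIA;
        if (strstr(vendor, "ATI") || strstr(vendor, "AMD"))
            return VENDOR_ATI;
        if (strstr(vendor, "Intel"))
            return VENDOR_INTEL;
        return VENDOR_OTHER;
    }

    /* Each codepath can then stick to the extensions that vendor's driver
     * actually exposes instead of assuming one uniform feature set. */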

  10. #30
    Join Date
    Aug 2007
    Posts
    6,610

    Default

    I am sure that will be done in time for fglrx with OpenGL 3.2 support. Since NV already provides those test drivers, Wine could adopt that new codepath now, and fglrx users will be happy when ATI manages to provide it too. Currently it does not matter in which way the functions are used, as they only run on NV hardware anyway.
