Thread: Attempting to stuff a ull into an int

  1. #1

    Attempting to stuff a ull into an int

    I found a minor bug while trying to compile in Visual Studio 2012.
    It tells me you're trying to store an unsigned long long in an integer-sized space.

    In the file include/ClanLib/GL/opengl_defines.h : line 1041
    Code:
        GL_TIMEOUT_IGNORED = 0xFFFFFFFFFFFFFFFFull,
    An enum is 4 bytes even when compiling for 64-bit [at least in Visual C++].

    The code should instead be:
    Code:
        GL_TIMEOUT_IGNORED = 0xFFFFFFFFu,
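
    Here is a minimal, hypothetical sketch (not ClanLib code; the enum names are illustrative) of why the compiler complains: older Visual C++ gives an unscoped enum an int underlying type, so a 64-bit enumerator value cannot be represented without truncation.
    Code:
        #include <iostream>

        enum Plain                      // underlying type chosen by the compiler (int in VC++)
        {
            PLAIN_VALUE = 0x7FFFFFFF    // still fits in a 32-bit int
        };

        int main()
        {
            std::cout << "sizeof(Plain) = " << sizeof(Plain) << '\n';   // 4 under VS2012
            // Changing PLAIN_VALUE to 0xFFFFFFFFFFFFFFFFull makes VS2012 warn that a
            // 64-bit constant is being stored into an int-sized enum, which is this bug.
        }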

  2. #2

    ClanLib Developer

    GL_TIMEOUT_IGNORED = 0xFFFFFFFFu would be incorrect.

    It's used for https://www.opengl.org/sdk/docs/man3...glWaitSync.xml to specify the 64-bit timeout parameter.

    Since ClanLib 3.1 (development git) requires C++11, I guess we could use "enum class" for just that one,
    or a "const unsigned long long GL_TIMEOUT_IGNORED" global.

    We try to avoid #defines, since they are not namespace friendly.
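
    As a hedged sketch of the two options mentioned above (identifiers and the clan namespace are illustrative, not the actual ClanLib declarations), both keep the full 64-bit value that glWaitSync expects for its timeout parameter, and both stay namespace friendly:
    Code:
        #include <cstdint>

        namespace clan
        {
            // Option 1: a C++11 scoped enum with a fixed 64-bit underlying type.
            enum class GLTimeout : std::uint64_t
            {
                ignored = 0xFFFFFFFFFFFFFFFFull
            };

            // Option 2: a namespaced constant instead of a #define.
            const unsigned long long GL_TIMEOUT_IGNORED = 0xFFFFFFFFFFFFFFFFull;
        }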
