SDL3 new GPU API merged

github.com

306 points by caspar a year ago · 123 comments

casparOP a year ago

SDL3 is still in preview, but the new GPU API is now merged into the main branch while SDL3 maintainers apply some final tweaks.

As far as I understand: the new GPU API is notable because it should allow writing graphics code & shaders once and have it all work cross-platform (including on consoles) with minimal hassle - and previously that required Unity or Unreal, or your own custom solution.

WebGPU/WGSL is a similar "cross-platform graphics stack" effort but as far as I know nobody has written console backends for it. (Meanwhile the SDL3 GPU API currently doesn't seem to support WebGPU as a backend.)

  • dxuh a year ago

    Unreal/Unity are not the only solutions. There is also bgfx (https://github.com/bkaradzic/bgfx), which is quite popular, and sokol gfx (https://github.com/floooh/sokol), that I know of. Of course there are many more lesser-known ones.

    • benlwalker a year ago

      Compared to libraries like bgfx and sokol at least, I think there are two key differences.

      1) SDL_gpu is a pure C library, heavily focused on extreme portability and no dependencies. And somehow it's also an order of magnitude less code than the other options. Or at least this is a difference from bgfx, maybe not so much sokol_gfx.

      2) The SDL_gpu approach is a bit lower level. It exposes primitives like command buffers directly to your application (so you can more easily reason about multi-threading), and your software allocates transfer buffers, fills them with data, and kicks off a transfer to GPU memory explicitly rather than this happening behind the scenes. It also spawns no threads - it only takes action in response to function calls. It does take care of hard things such as getting barriers right, and provides the GPU memory allocator, so it is still substantially easier to use than something like Vulkan. But in SDL_gpu it is extremely obvious to see the data movements between CPU and GPU (and memory copies within the CPU), and to observe the asynchronous nature of the GPU work. I suspect the end result of this will be that people write far more efficient renderers on top of SDL_gpu than they would have on other APIs.
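
      To make that concrete, here is a minimal sketch of an explicit vertex buffer upload. I'm using the current SDL3 GPU names (SDL_CreateGPUTransferBuffer and friends); since SDL3 is still in preview, details may shift:

          /* Assumes an SDL_GPUDevice *device and a vertices[] array exist. */
          SDL_GPUBufferCreateInfo buf_info = { 0 };
          buf_info.usage = SDL_GPU_BUFFERUSAGE_VERTEX;
          buf_info.size = (Uint32)sizeof(vertices);
          SDL_GPUBuffer *vbuf = SDL_CreateGPUBuffer(device, &buf_info);

          SDL_GPUTransferBufferCreateInfo tbuf_info = { 0 };
          tbuf_info.usage = SDL_GPU_TRANSFERBUFFERUSAGE_UPLOAD;
          tbuf_info.size = (Uint32)sizeof(vertices);
          SDL_GPUTransferBuffer *tbuf = SDL_CreateGPUTransferBuffer(device, &tbuf_info);

          /* Your code fills CPU-visible memory explicitly... */
          void *map = SDL_MapGPUTransferBuffer(device, tbuf, false);
          SDL_memcpy(map, vertices, sizeof(vertices));
          SDL_UnmapGPUTransferBuffer(device, tbuf);

          /* ...and explicitly records and submits the copy to GPU memory. */
          SDL_GPUCommandBuffer *cmd = SDL_AcquireGPUCommandBuffer(device);
          SDL_GPUCopyPass *pass = SDL_BeginGPUCopyPass(cmd);
          SDL_GPUTransferBufferLocation src = { tbuf, 0 };
          SDL_GPUBufferRegion dst = { vbuf, 0, (Uint32)sizeof(vertices) };
          SDL_UploadToGPUBuffer(pass, &src, &dst, false);
          SDL_EndGPUCopyPass(pass);
          SDL_SubmitGPUCommandBuffer(cmd);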

    • jcelerier a year ago

      Qt RHI too. Shaders are normal Vulkan-compatible GLSL, and then you get D3D11, D3D12, OpenGL and Metal.

  • bgbernovici a year ago

    I previously integrated bgfx [1], which allows you to write graphics code and shaders once and supports consoles, with the SDL2 stack and Swift [2]. It was quite a nice experience, especially for someone who had never worked with any of these tools before. I'm excited for SDL3 as it introduces console abstractions, eliminating the need for additional dependencies for the GPU API. Moreover, Godot officially supports the Steam Deck, and hopefully, more consoles will be supported in the future. On a related note, Miguel de Icaza is advocating for Swift adoption in Godot, and he is working on porting the editor to SwiftUI on iPad. It is interesting to see the progress [3].

    [1] https://bkaradzic.github.io/bgfx/overview.html

    [2] https://github.com/bgbernovici/myndsmith

    [3] https://blog.la-terminal.net/xogot-code-editing/

  • runevault a year ago

    Worth noting Godot also has cross-platform shaders. Its GDShader language is based heavily on the OpenGL shading language, though it is not a 1:1 copy, and gets compiled for the target platform. Though for PS5 and Xbox you have to work with a third party (someone released the Nintendo build for anyone who's signed the Nintendo NDA).

    • jb1991 a year ago

      SDL is also offering cross-platform GPU compute; is that also available in Godot?

      • runevault a year ago

        So I haven't used compute shaders, though I remembered Godot having them and double-checked. Interestingly, they are direct GLSL, which makes me wonder if they only work in OpenGL contexts. Which would be... weird, because Godot 4.3 shipped easy DirectX output support. I'm sort of tempted to try making a compute shader and compiling to DX to see if it works.

        Edit: Doing more digging, according to the end of this forum thread they get compiled to SPIR-V and then to whatever backend is needed, be it GLSL, HLSL, etc.

        https://forum.godotengine.org/t/compute-shaders-in-godot/461...

  • shmerl a year ago

    Why is an SDL API needed vs gfx-rs/wgpu, though? I.e., was there a need to make yet another one?

    • dottrap a year ago

      The old SDL 2D API was not powerful enough. It was conceived in the rectangle sprite blitting days, when video hardware was designed very differently and had drastically different performance characteristics. If you wanted anything more, OpenGL used to be 'the best practice'. But today, the landscape competes between Vulkan, Metal, and Direct3D, and hardware is centered around batching and shaders. Trying to target OpenGL is more difficult because OpenGL fragmented between GL vs. GLES and platform support for OpenGL varies (e.g. Apple stopped updating GL after 4.1).

      A good example demonstrating where the old SDL 2D API is too limited is the 2D immediate-mode GUI library Nuklear. It has a few simple API stubs to fill in so it can be adapted to work with any graphics system. But for performance, it wants to batch-submit all the vertices (a triangle strip), and SDL's old API didn't support anything like that.

      The reluctance was that the SDL maintainers didn't want to create a monster and couldn't decide where to draw the line, so the line was held at the old 2D API. Then a few years ago, a user successfully changed the maintainers' minds by writing a demonstration showing how much could be achieved by just adding a simple batching API to SDL 2D. That shifted the mindset and led to this current effort. I have not closely followed the development, but I think it still aims to be a simple API, and you will still be encouraged to pick a full-blown 3D API if you go beyond 2D needs. But you should no longer need to go to one of the other APIs to do 2D things in modern ways on modern hardware.
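
      For reference, the batching API that came out of that demonstration is SDL_RenderGeometry, added in SDL 2.0.18. A rough sketch of what batch submission looks like, assuming an existing renderer and texture:

          /* One call submits a whole batch of textured, colored triangles,
             which is exactly what libraries like Nuklear and Dear ImGui want. */
          SDL_Vertex verts[3] = {
              /* position           color                   tex coord */
              { { 100.0f, 100.0f }, { 255, 255, 255, 255 }, { 0.0f, 0.0f } },
              { { 200.0f, 100.0f }, { 255, 255, 255, 255 }, { 1.0f, 0.0f } },
              { { 150.0f, 200.0f }, { 255, 255, 255, 255 }, { 0.5f, 1.0f } },
          };
          SDL_RenderGeometry(renderer, texture, verts, 3, NULL, 0);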

      • HexDecOctBin a year ago

        I think you are confusing SDL_Render and SDL_GPU. SDL_Render is the old accelerated API that was only suitable for 2D games (or very primitive-looking 3D ones). SDL_GPU is a fully-featured wrapper around modern 3D APIs (well, the rasteriser and compute parts anyway, no raytracing or mesh shaders there yet).

        • dottrap a year ago

          I was referencing the historical motivations that led to where we are today. Yes, I was referring in part to the SDL_Render family of APIs. These were insufficient to support things like Nuklear and Dear ImGui, which are reasonable use cases for the simple 2D games that SDL hoped to help with by introducing the SDL_Render APIs in SDL 2.0 in the first place.

          https://www.patreon.com/posts/58563886

          Short excerpt:

              One day, a valid argument was made that basic 2D triangles are pretty powerful in themselves for not much more code, and it notably makes wiring the excellent Dear Imgui library to an SDL app nice and clean. Even here I was ready to push back but the always-amazing Sylvain Becker showed up not just with a full implementation but also with the software rendering additions and I could fight no longer. In it went.
          
          The next logical thing people were already clamoring for back then was shader support. Basically, if you can provide both batching (i.e. triangles) and shaders, you can cover a surprising amount of use cases, including many beyond 2D.

          So fast forwarding to today, you're right. Glancing at the commit, the GPU API has 80 functions. It is full-featured beyond its original 2D roots. I haven't followed the development enough to know where they are drawing the lines now, like would raytracing and mesh shaders be on their roadmap, or would those be a bridge too far.

          • HexDecOctBin a year ago

            > where they are drawing the lines now

            From what I understand, they are only going to support features that are widely supported and standardised. Thus, even bindless didn't make the cut. Raytracing, mesh shaders, work-graphs, etc. almost certainly won't make it until SDL4 10 years from now; but I am not part of the development team, so don't quote me.

      • phaedrus a year ago

        Does SDL3 still use integers for coordinates? I got annoyed enough by coordinates not being floating point in SDL2 that I started learning WebGPU instead, even though the game I was working on was 2D.

        The issue is, if you want complete decoupling (in the sense of orthogonality) among all four of:

        - screen (window) size & resolution (especially if game doesn't control)

        - sprite/tile image quantization into pixels (scaling, resolution)

        - sprite display position, with or without subpixel accuracy

        - and physics engine that uses floating point natively (BulletPhysics)

        then to achieve this with integer drawing coordinates requires carefully calculating ratios while understanding where you do and do not want to drop the fractional part. Even then you can still run into problems such as accidentally having a gap (a one-pixel-wide blank column) between every 10th and 11th level tile because your zoom factor has a tenth-of-a-pixel overflow, or jaggy movement with wiggly sprites when the player is moving at a shallow diagonal while the NPC sprites are at different floating-point or subpixel integer coords.
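
        A toy illustration of the tile-gap case, with made-up numbers: 16-pixel tiles at a 1.1x zoom, truncating each tile's position and width independently:

            int w  = (int)(16 * 1.1);     /* 17.6 truncates to 17 */
            int x1 = (int)(1 * 16 * 1.1); /* 17.6 truncates to 17 */
            int x2 = (int)(2 * 16 * 1.1); /* 35.2 truncates to 35 */
            /* Tile 1 covers columns 17..33, but tile 2 starts at 35,
               so column 34 is a one-pixel blank gap. */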

        A lot of these problems could be (are) because I think of things bottom-up (as my list above is ordered), where a physics engine based on floating-point math is the source of Truth, and each layer above is just a viewport abstracting something from the layer beneath. I get the impression SDL was written by and for people with the opposite point of view, that the pixels are important and primary.

        And all (most) of these have solutions in terms of pre-scaling, tracking remainders, etc., but I have also written an (unfinished) 3D engine and didn't have to do any of that, because 3D graphics is floating-point native. After getting the 2D engine 90% done with SDL2 (leaving 90% more to go, as we all know), I had a sort of "WTF am I even doing" moment looking at the pile of workarounds for a problem that shouldn't exist.

        And I say shouldn't exist because I know the final output is actually using floating point in the hardware and the driver; the SDL1/2 API is just applying this fiction that it's integers. (Neither simple, nor direct.) It gets steam coming out of my ears knowing I'm being forced to do something stupid to maintain someone else's fiction, so as nice as SDL otherwise is, I ultimately decided to just bite the bullet and learn to program WebGPU directly.

        • krapp a year ago

          >Does SDL3 still use integers for coordinates?

          No, they added float versions for most functions and I think they plan on deprecating the int API in the future. The only exception I can think of offhand is still needing an integer rect to set a viewport.

          • phaedrus a year ago

            That's good, then. Honestly an integer rect for the viewport is, "not wrong."

        • shortrounddev2 a year ago

          SDL2 added floating point versions of most rendering functions
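
          For example, SDL_RenderCopyF with an SDL_FRect destination (available since SDL 2.0.10):

              /* A subpixel sprite position survives all the way to the renderer. */
              SDL_FRect dst = { 123.4f, 56.78f, 32.0f, 32.0f };
              SDL_RenderCopyF(renderer, sprite_texture, NULL, &dst);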

      • jandrese a year ago

        I was messing around a bit with SDL2 and either I was doing something wrong or it was just plain slow. My machine is plenty fast, but even just blitting a few dozen PNGs around a screen 60 times a second was pushing its limits. I freely admit I may have been doing something wrong, but I was surprised at just how inefficient it was at a task that we used to do without too much trouble on 1 MHz CPUs.

        Maybe SDL_RenderCopy is the wrong API to use to blit things from a sprite sheet onto a display? The docs didn't give any warning if this is the case.

        • krapp a year ago

          How recent a version were you using? Plenty of games and graphical apps use SDL2 under the hood, and rendering rects from a spritesheet is trivial. Recent versions use the geometry API for rendering rects, so it should be able to handle tons of sprites without much effort.

          • jandrese a year ago

            I'm using SDL2 2.30.0. The main loop is pretty simple: it does a few SDL_RenderFillRects to create areas, then several SDL_RenderCopy calls where the source is an SDL_Texture created (via SDL_CreateTextureFromSurface) from an SDL_Surface that was loaded from files at boot. A final call to SDL_RenderPresent finishes it off. The textures do include an alpha channel, however.

            I was expecting the sprite blitting to be trivial, but it is surprisingly slow. The sprites are quite small, only a few hundred pixels total. I have a theory that it is copying the pixels over the X11 channel each time instead of loading the sprite sheets onto the server once and copying regions using XCopyArea to tell the server to do its own blitting.

            • dottrap a year ago

              This should be plenty fast. SDL_RenderCopy generally should be doing things the 'right' way on any video card made in roughly the last 15-ish years (basically binding a texture in GPU RAM to a quad).

              You probably need to do some debugging/profiling to find where your problem is. Make sure you aren't creating SDL_Textures (or loading SDL_Surfaces) inside your main gameplay loop. You also may want to check which backend the SDL_Renderer is using (e.g. OpenGL, Direct3D, Vulkan, Metal, software). If you are on software, that is likely your problem. Try forcing it to something hardware-accelerated.
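
              A quick way to check, sketched against the SDL2 API:

                  /* See which backend the renderer actually picked... */
                  SDL_RendererInfo info;
                  if (SDL_GetRendererInfo(renderer, &info) == 0) {
                      SDL_Log("renderer backend: %s", info.name); /* e.g. "opengl" or "software" */
                  }

                  /* ...or ask for a hardware backend before creating the renderer. */
                  SDL_SetHint(SDL_HINT_RENDER_DRIVER, "opengl");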

              Also, I vaguely recall there was a legacy flag on SDL_Surfaces called "hardware" or "SDL_HWSURFACE" or "SDL_HWACCEL" or something. Don't set that. It was for very legacy hardware from like 25 years ago and is slow on everything now.

            • krapp a year ago

              Whatever the problem is, it probably isn't SDL. Here's a test project I worked on[0], and I'm using a garbage laptop. The sprites aren't that big but if you're just using a single texture it shouldn't matter, since SDL does sprite batching anyway.

              Your theory might be right - the first thing I would look for is something allocating every frame.

              You might ask the SDL Discourse forum and see what they think: https://discourse.libsdl.org/

              [0]https://cdn.masto.host/krappmastohost/media_attachments/file...

          • account42 a year ago

            > Plenty of games and graphical apps use SDL2 under the hood

            How many of them use the render API, though, rather than just using SDL to create a window and handle input (and perhaps manage an OpenGL context)?

        • account42 a year ago

          Is 60 FPS your screen refresh rate? Perhaps you have VSync enabled.

      • shmerl a year ago

        I see, interesting.

    • badsectoracula a year ago

      SDL provides various "backend-agnostic" APIs for a variety of needs, including window creation, input (with a gamepad abstraction), audio, and system stuff (e.g. threads), so that programs written against SDL can work on a variety of systems. If linked against it dynamically (or using the "static linking but with a dynamic override" mechanism that allows a statically linked version to use a newer dynamic version of the library), they can also pick up newer/better stuff. That is sometimes needed: some older games using old versions of SDL 1.x need the DLL/.so replaced with a newer version to work on new OSes, especially on Linux.

      Exposing a modern (in the sense of how self-proclaimed modern APIs like Vulkan, D3D12 and Metal work) GPU API that lets applications written against it work with various backends (D3D11, D3D12, Vulkan, Metal, whatever the Switch and PS5 use, etc.) fits perfectly with what SDL already does for every other aspect of making a game/game engine/framework/etc.

      As for whether it was "needed", it was needed as much as any other of SDL's "subsystems": strictly speaking not really, as you could use some other library (but that could be said of SDL itself). From the perspective of what SDL wants to provide (an API to target so you won't have to target each underlying API separately), though, it was needed for the sake of completeness. Previously OpenGL filled this role if you wanted 3D graphics, but that was when OpenGL was practically universally available on the platforms SDL officially supported, which is no longer the case nowadays.

    • flohofwoe a year ago

      While WebGPU isn't a bad API, it also isn't exactly the '3D API to end all 3D APIs'.

      WebGPU has a couple of design decisions that were necessary to support Vulkan on mobile devices and that make it a very rigid API. Even (desktop) Vulkan is moving away from that rigid programming model, while WebGPU won't be able to adapt as quickly because it still needs to support outdated mobile GPUs across all operating systems.

    • flohofwoe a year ago

      One important point I haven't seen mentioned yet is that SDL is the de facto minimal compatibility layer on Linux for writing a windowed 'game-y' application if you don't want to directly wrestle with X11, Wayland, GTK or KDE.

      • ahartmetz a year ago

        Yeah - getting an OpenGL (and presumably Vulkan) context is surprisingly annoying if you don't have a library to help you. It also works quite differently on X11, Wayland, or directly on kernel APIs. Many games that don't otherwise use SDL2 (such as ones ported from other platforms, i.e. most games) use it just for that.
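
        For comparison, the whole dance through SDL2 is only a handful of calls, identical across X11 and Wayland (a minimal sketch):

            SDL_Init(SDL_INIT_VIDEO);
            SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
            SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 3);
            SDL_Window *win = SDL_CreateWindow("demo",
                SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
                800, 600, SDL_WINDOW_OPENGL);
            SDL_GLContext ctx = SDL_GL_CreateContext(win);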

    • modeless a year ago

      The more the merrier if you ask me. Eventually one will win but we need more experimentation in this space. The existing GPU APIs are too hard to use and/or vendor-specific.

      • TillE a year ago

        Writing bits of Vulkan or D3D12 really isn't that bad if you're working within an engine which does most of the setup for you, which is nearly always the case for practical work. If you're doing everything yourself from scratch, you're probably either a hobbyist tinkering or a well-compensated expert working for a AAA game developer.

        • flohofwoe a year ago

          Using something like Unreal or Unity if you just want to blast a couple of triangles to the screen in a cross-platform application is a bit overkill.

        • monocasa a year ago

          If you're targeting SDL, then you probably don't have an engine, or you are the engine.

        • gmueckl a year ago

          Nit: there are also really sophisticated graphics engines for serious applications. It's not only games.

    • throwup238 a year ago

      SDL the library is over a quarter century old. It powers tons of existing software. Why wouldn't people keep working on it?

      • 01HNNWZ0MV43FF a year ago

        They already broke compat for 2.x, and existing games don't have shaders in 1.x or 2.x, right? So why make their own API?

        • badsectoracula a year ago

          Yes and no. SDL 2.x is not backwards compatible with SDL 1.x (and that was an annoyance of mine) but at some point someone wrote an SDL 1.x implementation on top of SDL 2.x that got official blessing, so at least games using SDL 1.x can be made to use SDL 2.x "under the hood" be it in source code form or binary-only form.

          Though you can't take an SDL 1.x game and convert it piecemeal to SDL 2.x as the APIs are not backwards compatible, it is an all-or-nothing change.

          • gmueckl a year ago

            The API breaks in SDL2 were sorely needed, if you asked me. SDL1 painted itself into a corner in a few places, e.g. simultaneous use of multiple displays/windows.

            • badsectoracula a year ago

              I don't think they were needed, but I value not breaking existing programs and code more than some abstract and often highly subjective form of code purity.

              The compatibility layer that was introduced a few years later did solve the "SDL1 apps running under SDL2 under the hood (though with some regressions)" compatibility issue, and it somewhat solved the "compile existing code that uses SDL1 against SDL2" issue (depending on your language and SDL bindings; I had to compile the real SDL 1.2 library to have Free Pascal's bindings work, since they didn't work with sdl12-compat). But it did not solve the "update existing code to use the new features without rewriting everything" compatibility issue (there was even a user in the PR or on Twitter asking about future plans for compatibility because he had spent years updating his code from SDL 1.2 to SDL 2.0 and didn't want to repeat the process again - FWIW the answer was that a new major backwards-incompatible version probably won't come for at least 10 years).

          • anthk a year ago

            SDL2 has a compat library for SDL1:

            https://github.com/libsdl-org/sdl12-compat

    • TinkersW a year ago

      WebGPU would be a lot more useful if it hadn't gone with such a needlessly different shader language syntax; it makes it much harder to share any single source between the C++ and the shader code.

    • adrift a year ago

      Having a C API like that is always nice. I don't wanna fight Rust.

    • WhereIsTheTruth a year ago

      SDL is for gamedevs, so it supports consoles; wgpu is not, so it doesn't.

      • hnlmorg a year ago

        SDL is for everyone. I use it for a terminal emulator because it’s easier to write something cross platform in SDL than it is to use platform native widgets APIs.

        • westurner a year ago

          Can the SDL terminal emulator handle up-arrow /slash commands, and cool CLI things like Textual and IPython's prompt_toolkit (a readline/.inputrc alternative that supports multi-line editing, argument tab completion, and syntax highlighting), in a game and/or on a PC?

          • debugnik a year ago

            I think you're confusing the roles of terminal emulator and shell. The emulator mainly hosts the window for a text-based application: print to the screen, send input, implement escape sequences, offer scrollback, handle OS copy-paste, etc. The features you mentioned would be implemented by the hosted application, such as a shell (which they've also implemented separately).

        • WhereIsTheTruth a year ago

          You are right - I was so focused on the gamedev argument that it made me write an incorrect statement.

      • anthk a year ago

        SDL is great for embedded machines with limited displays.

quadhome a year ago

More context here: https://icculus.org/finger/flibitijibibo?date=2024-06-15&tim...

  • stateoff a year ago

    This article from the main author of the API describes how the buffer cycling works: https://moonside.games/posts/sdl-gpu-concepts-cycling/
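
    The core idea, as I read the post: passing cycle = true when touching a resource the GPU may still be reading tells SDL to transparently swap in a fresh backing allocation instead of stalling. A minimal sketch using the current SDL3 names:

        /* Per-frame dynamic data: cycle = true avoids waiting on the GPU by
           "renaming" the transfer buffer's backing memory if it is still in use. */
        void *map = SDL_MapGPUTransferBuffer(device, tbuf, true);
        SDL_memcpy(map, frame_verts, sizeof(frame_verts));
        SDL_UnmapGPUTransferBuffer(device, tbuf);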

    • kvark a year ago

      This rubs me the wrong way. Resource renaming is a high level concept, which was one of the features of OpenGL that all the modern APIs have successfully dropped. A modern low level GPU API should not do that.

      • enqk a year ago

        It’s so convenient to have buffer renaming as a user of a graphics API. What’s not to like about it?

rudedogg a year ago

It's exciting to see how this all shakes out. Hopefully we end up with more options for building custom game engines and apps.

I've been going down the Vulkan rabbit hole. It's been fun/enlightening to learn, but the nature of Vulkan makes progress feel slow. I think if SDL3 were available when I started, I would have happily gone that route and would have more to show for the amount of time I've invested.

kvark a year ago

Time will tell if this API is usable. In particular, the way resource synchronization works, and object renaming. Time will tell if it will perform better than WebGPU or other abstractions. Time will tell if it remains small in the presence of driver bugs that need working around…

I’m also sceptical about their new bytecode for a shading language. Parsing shaders at runtime was not a concern with WebGPU - it’s very fast. Even producing native shaders is fast [1]. It’s the pipeline creation that’s slow, and this bytecode wouldn’t help here.

[1] http://kvark.github.io/naga/shader/2022/02/17/shader-transla...

ivars a year ago

How did they manage to pull this off so quickly? Given how long WebGPU native has been in development and is still not finalized, you would think the SDL GPU API would take even longer because it supports more platforms.

  • HexDecOctBin a year ago

    The reason WebGPU took so long was that they decided to write their own shading language instead of using SPIR-V. SDL didn't make that mistake; you bring your own shader compilers and translation tools.

    There is a sister project for a cross-platform shading language [1] and another for translating existing ones between each other [2], but they get done when they get done, and the rest of the API doesn't have to wait for them.

    WebGPU was made by a committee of vendors and language-lawyers (standards-lawyers?) with politics and bureaucracy, and it shows. SDL_GPU is made by game developers who value pragmatism above all (and often are looked down upon from the ivory tower because of that).

    [1]: https://github.com/libsdl-org/SDL_shader_tools

    [2]: https://github.com/flibitijibibo/SDL_gpu_shadercross

    • hmry a year ago

      Yeah, legal strikes again. Unfortunately SPIR-V was never going to be an option for WebGPU, because Apple refuses to use any Khronos projects due to a confidential legal dispute between them.[0] If WebGPU used SPIR-V, it just wouldn't be available in Safari.

      See also: Not supporting Vulkan or OpenXR at all, using USD instead of glTF for AR content even though it's less well suited for the task, etc. (Well, they probably don't mind that it helps maintain the walled garden either... There's more than one reason for everything)

      0: https://docs.google.com/document/d/1F6ns6I3zs-2JL_dT9hOkX_25...

      • davemp a year ago

        # Attendance

        ## Khronos

        Neil Trevett

        ## Apple

        Dean Jackson, Myles C. Maxfield, Robin Morisset, Maciej Stachowiak, Saam Barati

        ## Google

        Austin Eng, Corentin Wallez, Dan Sinclair, David Neto, James Darpinian, Kai Ninomiya, Ken Russell, Shrek Shao, Ryan Harrison

        ## Intel

        Yunchao He

        ## Mozilla

        Dzmitry Malyshau

        ## W3C

        François Daoust, Dominique Hazael-Massieux

        ## Timo de Kort [sic?]

        ———

        I get that Apple/Google have significantly more resources than most organizations on the planet, but if these demographics are representative of other (web) standards committees, that's depressing.

    • grovesNL a year ago

      I don't think that's accurate. Creating a shading language is obviously a huge effort, but there were already years of effort put into WebGPU as well as implementations/games building on top of the work-in-progress specification before the shading language decision was made (implementations at the time accepted SPIR-V).

      • HexDecOctBin a year ago

        The PoC was made in 2016, the work started in 2017, but the first spec draft was released on 18 May 2021. [1] This first draft already contained references to WGSL. There is no reference to SPIR-V.

        Why did it take this long to release the first draft? Compare it to SDL_GPU timeline, start to finish in 6 months. Well, because the yak shaving on WGSL had already begun, and was eating up all the time.

        [1]: https://www.w3.org/TR/2021/WD-webgpu-20210518/

        • grovesNL a year ago

          SPIR-V was never in the specification, but both wgpu and Dawn used SPIR-V in the meantime until a shading language decision was made.

          • HexDecOctBin a year ago

            Sure, but that proves my point. They took so long to decide upon the shading language that implementations had to erect a separate scaffolding just to be able to test things out.

            • grovesNL a year ago

              Scaffolding wasn’t a problem at all. Both used SPIRV-Cross for shader conversions at the time and focused on implementing the rest of the API. The shading language barely matters to the rest of the implementation. You can still use SPIR-V with wgpu on its Vulkan backend today for example.

  • kevingadd a year ago

    The core contributors of the SDL3 GPU project have experience with two cross-platform (PC + consoles) GPU abstraction layers, FNA3D and Refresh, which provided a lot of knowledge and existing open source code to use as a springboard to assemble this quickly with high quality.

  • bartwe a year ago

    No committee and motivated devs that need the result for their projects. Especially the FNA folks.

    • flohofwoe a year ago

      Also, to be fair, the WebGPU folks did a lot of investigation into what the actual set of common and web-safe features across D3D, Vulkan and Metal is, and all those investigation results are in the open.

      In that sense the WebGPU project is an extremely valuable resource for other wrapper APIs, and saves those other APIs a ton of time.

      • jms55 a year ago

        Yeah. SDL went the path of "wrap native APIs". WebGPU went the path of "exactly what level of floating point precision can we guarantee across all APIs" along with "how do we prevent absolutely all invalid behavior at runtime, e.g. out of bounds accesses in shaders, non-dynamically uniform control flow at invalid times, indirect draws that bypass the given limits, preventing too-large shaders that would kill shader compilers, etc".

        WebGPU spends a _lot_ of time investigating buggy driver behavior and trying to make things spec-conformant across a lot of disparate and frankly janky platforms. There's a big difference between writing an RHI, and writing a _spec_.

  • account42 a year ago

    Simple: SDL GPU left out the rest of the owl (translating shaders from a common format to API-specific intermediates).

bartwe a year ago

Glad to have contributed to the dx12 part :)

  • HexDecOctBin a year ago

    Bravo, thanks! Since I'll be targeting modern HLSL, your backend is the one I'll be using to begin with. Hopefully DXC produces decent SPIR-V at the end.

  • corysama a year ago

    What resources would you recommend for learning DX12?

Ono-Sendai a year ago

I might try this out. SDL I have found to be high quality software - compiles fast, compiles easily on multiple platforms, always works. So I have some hopes for this new API.

JKCalhoun a year ago

Huge fan of SDL generally.

When I went looking for a cross-platform gaming library, SDL and its API struck the right balance for me. I just wanted a C (++) library I could call to create windows and graphical contexts — a fast sprite rendering framework. I didn't need a whole IDE or a bloated library, didn't want to learn a new language, etc.

immibis a year ago

Feels like SDL3 suffers from the second-system effect. (SDL2 was just SDL1 with explicit window handles, so SDL3 is the second system, not the third.) SDL1/2 is a thin layer that wraps the platform-specific boilerplate of opening a window and handling input events, so you can get to the OpenGL rendering stuff that you actually wanted to write.

  • casparOP a year ago

    If you only want to support Windows/Linux/Android, then sure, you can definitely argue that the SDL GPU API is bloat.

    But if you want to support Apple's operating systems then you're stuck with OpenGL 4.1 (officially deprecated by Apple 5 years ago) - so no modern GPU features like compute shaders.

    You can go the Vulkan route and use MoltenVK for Apple systems, but Vulkan is quite a step up in complexity from OpenGL ("1000 lines of code for a triangle" as people like to say). The goal for SDL3's GPU API is to give you a more approachable (but still plenty flexible) alternative to that.

    And similar story for consoles, presumably.

    Apparently lots of people asked for "SDL_render but can you add shader support that works for all platforms", so that's the origin story.

    SDL3 does also add a higher level audio API - I don't know much about its merits.

    • casparOP a year ago

      Ah, I managed to dig up the original announcement post[0]; relevant snippet:

      > But this is terrible advice in 2021, because OpenGL, for all intents and purposes, is a deprecated API. It still works, it's still got some reasonably modern features, but even if you add up the 22 years Microsoft spent trying to kill it with Apple's seven-or-maybe-twenty, it doesn't change the fact that the brains behind OpenGL would rather you migrate to Vulkan, which is also terrible advice.

      > It seems bonkers to tell people "write these three lines of code to make a window, and then 2000 more to clear it," but that's the migration funnel--and meat grinder--that SDL users are eventually going to get shoved into, and that's unacceptable to me.

      [0]: https://www.patreon.com/posts/new-project-top-58563886
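
      For contrast, clearing the window with the new GPU API comes out to roughly this (my sketch against the current SDL3 names, which may still change during the preview):

          /* Assumes the window was claimed with SDL_ClaimWindowForGPUDevice(). */
          SDL_GPUCommandBuffer *cmd = SDL_AcquireGPUCommandBuffer(device);
          SDL_GPUTexture *swapchain = NULL;
          SDL_AcquireGPUSwapchainTexture(cmd, window, &swapchain, NULL, NULL);
          if (swapchain) {
              SDL_GPUColorTargetInfo target = { 0 };
              target.texture = swapchain;
              target.clear_color = (SDL_FColor){ 0.1f, 0.2f, 0.3f, 1.0f };
              target.load_op = SDL_GPU_LOADOP_CLEAR;
              target.store_op = SDL_GPU_STOREOP_STORE;
              SDL_GPURenderPass *pass = SDL_BeginGPURenderPass(cmd, &target, 1, NULL);
              SDL_EndGPURenderPass(pass);
          }
          SDL_SubmitGPUCommandBuffer(cmd);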

    • hgs3 a year ago

      But why does the GPU API need to be in mainline SDL? Couldn't it be a separate project like SDL_net, SDL_mixer, SDL_image, and SDL_ttf? I would think that as a separate project "SDL_gpu" could be versioned independently, evolve independently, and not be obligated to support every platform SDL itself supports. In fact if "SDL_gpu" only required a windowing context, then it could presumably integrate with SDL2 and non-SDL applications!

      • casparOP a year ago

        AFAICT, if you don't want to use it then you don't have to - just like you didn't have to use SDL_render in SDL2. That is what was pitched by maintainer Ryan Gordon[0][1] at least.

        [0]: https://github.com/libsdl-org/SDL_shader_tools/blob/main/doc... , though the approach that ended up getting merged was an initially-competing approach implemented by FNA folks instead and they seem to have made some different decisions than what was outlined in that markdown doc.

        • gary_0 a year ago

          While using SDL for drawing is optional (and seldom done if you're doing 3D) I would like to add that its drawing API is useful to have out-of-the-box so that new/basic users can get stuff on screen right away without having to write their own high-level graphics engine first.

      • gary_0 a year ago

        See dottrap's comment: https://news.ycombinator.com/item?id=41397198

        SDL needs to be able to render graphics efficiently, but the SDL2 way is no longer sufficient. Since SDL3 is a major version change, it makes sense to overhaul it while a variety of other API-breaking improvements are being made.

    • creata a year ago

      Slightly off-topic, but where's the complexity of Vulkan (the 1000 lines) coming from? My memory tells me that most of the misery is from the window system integration, and that the rest is pretty pleasant.

  • gary_0 a year ago

    SDL2 was not "just SDL1 with explicit window handles". There were a variety of changes and new features all over the API, including (like SDL3) major changes to the graphics subsystem (SDL1 used software rendering, SDL2 added hardware acceleration).

    Also, SDL2 has evolved considerably since 2.0.0, and SDL3 continues that evolution while allowing API-breaking changes. SDL3 is not a from-scratch rewrite, and as an SDL user I don't anticipate that migrating from SDL2 to SDL3 will be that difficult.

    [edit] And SDL1/2 was never so "thin" that it didn't have its own high-level graphics system, which is useful to have out-of-the-box so that new/basic users can get stuff on screen right away.

    [edit2] As ahefner points out, SDL1 was pretty "thin" by modern standards, but it still gave you enough to draw basic stuff on screen without writing your own pixel math, which was pretty helpful back in the 90's.

    • ahefner a year ago

      SDL1 had no high-level graphics system - you either got a raw framebuffer, or an OpenGL context.

      • gary_0 a year ago

        True, now that I think back, all it had was a blit function, and nowadays that's not a graphics system. (But back in the old days, I was impressed that it handled alpha blending for me! Fancy!)

  • flohofwoe a year ago

    The problem is that OpenGL is (pretty much) dead, while Vulkan is a poor replacement for OpenGL when it comes to ease of use.

    • sylware a year ago

      I don't run GL games on elf/linux anymore, and it has been a while. Most cross-platform game engines have a Vulkan backend now.

      Very small teams are able to show games running the latest UE5.x engine on native elf/linux with Vulkan ("vein", "shmaragon" something).

      But the Steam client... is still 32-bit and hard-dependent on X11/GL...

      I still plan to code my own Wayland compositor once the Steam client is ELF64 and does proper wayland->x11/vulkan->CPU fallbacks. It will feel weird to have a clean 64-bit system.

    • shortrounddev2 a year ago

      Yeah, there needs to be a DirectX 11-like API between Vulkan and OpenGL.

  • account42 a year ago

    The render API was already needless bloat for many SDL users. SDL2 was also significantly larger than SDL1 in binary size already.

    This abstraction at least has the potential to fulfil the needs of anything more than simple 2D games while allowing you to target the sadly increasingly fragmented graphics API ecosystem (RIP, dreams of a universal OpenGL(Next) future). Looks like the hardest part (shader translation) isn't there yet, though.

jb1991 a year ago

I’ve never used this library before, but I’m very interested to see some examples of its cross-platform GPU compute abilities, if I understand from the link thread that they are now available. Does anyone have a suggestion on where to get started?

JoeyJoJoJr a year ago

I’d love to see Raylib get an SDL GPU backend. I’d pick it up in a heartbeat.

bni a year ago

Is this related to https://github.com/grimfang4/sdl-gpu ? Or is it a completely separate thing with the same name?

  • dottrap a year ago

    This is a separate thing with the same name. Although both share some common ideas. The grimfang4/sdl-gpu is a separate library used with SDL, while the new SDL GPU API is directly part of SDL. grimfang4/sdl-gpu is much older and works with today's SDL 2.

    The grimfang4/sdl-gpu was one good way to take advantage of modern GPUs in a simple way and work around the holes/limitations of the old SDL 2D API. The new SDL 3 GPU API will likely make things like grimfang4/sdl-gpu redundant.

davikr a year ago

Are there any examples?

ammar-DLL a year ago

I'm looking forward to native Wayland support.

kookamamie a year ago

Sorry, but the proposal for the included shading language looks pretty braindead to me.

See for yourself: https://github.com/libsdl-org/SDL_shader_tools/blob/main/doc...

Deviations from C-language families, such as "Flow control statements don't need parentheses." are completely unnecessary, I think. Same goes for "Flow control statements must use braces."

  • e4m2 a year ago

    The current SDL GPU API does not intend to use this shader language. Instead, users are expected to provide shaders in the relevant format for each underlying graphics API [1], using whatever custom content pipeline they desire.

    One of the developers made an interesting blog post motivating this decision [2] (although some of the finer details have changed since that was written).

    There is also a "third party" solution [3] by another one of the developers that enables cross-platform use of SPIR-V or HLSL shaders using SPIRV-Cross and FXC/DXC, respectively (NB: It seems this currently wouldn't compile against SDL3 master).

    [1] https://github.com/libsdl-org/SDL/blob/d1a2c57fb99f29c38f509...

    [2] https://moonside.games/posts/layers-all-the-way-down

    [3] https://github.com/flibitijibibo/SDL_gpu_shadercross
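
    In practice, feeding the API a prebuilt shader blob looks something like this sketch (using current SDL3 names; spirv_blob/dxil_blob are hypothetical stand-ins for bytes your content pipeline compiled offline):

        /* Ask the device which precompiled formats it accepts, then hand it
           the matching blob from your content pipeline. */
        SDL_GPUShaderFormat formats = SDL_GetGPUShaderFormats(device);
        SDL_GPUShaderCreateInfo info = { 0 };
        if (formats & SDL_GPU_SHADERFORMAT_SPIRV) {
            info.code = spirv_blob;
            info.code_size = spirv_blob_len;
            info.format = SDL_GPU_SHADERFORMAT_SPIRV;
        } else if (formats & SDL_GPU_SHADERFORMAT_DXIL) {
            info.code = dxil_blob;
            info.code_size = dxil_blob_len;
            info.format = SDL_GPU_SHADERFORMAT_DXIL;
        }
        info.entrypoint = "main";
        info.stage = SDL_GPU_SHADERSTAGE_VERTEX;
        SDL_GPUShader *shader = SDL_CreateGPUShader(device, &info);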

    • kookamamie a year ago

      Thanks for the clarification. From the sparse documentation of SDL_GPU it was somewhat difficult to understand which parts are part of the SDL 3 merge, and which parts are something else.

      I did find an example of using the GPU API, but I didn't see any mention of selecting a backend (Vk, etc.) in the example - is this possible or is the backend selected e.g. based on the OS?

      • e4m2 a year ago

        > is this possible or is the backend selected e.g. based on the OS?

        Selected in a reasonable order by default, but can be overridden.

        There are three ways to do so:

        - Set the SDL_HINT_GPU_DRIVER hint with SDL_SetHint() [1].

        - Pass a non-NULL name to SDL_CreateGPUDevice() [2].

        - Set the SDL_PROP_GPU_DEVICE_CREATE_NAME_STRING property when calling SDL_CreateGPUDeviceWithProperties() [3].

        The name can be one of "D3D11", "D3D12", "Metal" or "Vulkan" (case-insensitive). Setting the driver name for NDA platforms would presumably work as well, but I don't see why you would do that.

        The second method is just a convenient, albeit limited, wrapper for the third, so that the user does not have to create and destroy their own properties object.

        The global hint takes precedence over the individual properties.
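
        For example, the first two options side by side (a sketch):

            /* Force the Vulkan backend via the global hint... */
            SDL_SetHint(SDL_HINT_GPU_DRIVER, "vulkan");

            /* ...or request it directly at device creation. */
            SDL_GPUDevice *dev = SDL_CreateGPUDevice(
                SDL_GPU_SHADERFORMAT_SPIRV, /* shader formats you will supply */
                true,                       /* debug mode */
                "vulkan");                  /* driver name, NULL picks a default */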

        [1] https://wiki.libsdl.org/SDL3/SDL_HINT_GPU_DRIVER

        [2] https://wiki.libsdl.org/SDL3/SDL_CreateGPUDevice

        [3] https://wiki.libsdl.org/SDL3/SDL_CreateGPUDeviceWithProperti...

        • account42 a year ago

          > The global hint takes precedence over the individual properties.

          This seems like a bad design - when I explicitly pass something to a function I expect it to be honored and not overwritten by some global state, especially one that can come from an environment variable.

          I'm not even sure how a hint or a null parameter makes sense at all here since the program will be responsible for passing the shaders in the correct format (which isn't even checked outside of debug mode lol). There also doesn't seem to even be a way for the application to check what shader format is supported by the mystery device it was handed against its wishes, outside of getting the name and then mapping that back to supported shaders which may or may not change in the future.

          Having two entry points for device creation with wildly different argument types (one using flags, one using string-based properties with comically long names you might find in the Java world) is also not something I would have expected in a newly designed API - that kind of ugliness is usually the result of changing requirements that the initial entry point did not foresee.

  • BigJono a year ago

    Deviating from conventions to avoid footguns is so misguided. I've been writing C-family languages for like 15 years and have never once accidentally written if (foo); whatever;

    The convention itself IS the thing that stops you from fucking that up. It's the kind of thing you do once 2 days into a 30 year career and never again.

    I still think it's dumb in Javascript, where you could be using the language on day 2 of learning programming. But in a GPU shader language that it would be almost impossible to understand with no programming experience? It's actually insane.

    Having said that everything else about this project looks pretty good, so I guess they can get a pass lol.

  • mahkoh a year ago

    If control flow statements don't require parentheses to be parseable, doesn't that mean that it is the parentheses that are completely unnecessary?

  • dkersten a year ago

    I, on the other hand, find the C way brain dead and would be very happy with these changes.
