
Fortunately you can still use Vulkan on iOS and macOS through MoltenVK[1], a Vulkan implementation layered on top of Metal.

[1] https://github.com/KhronosGroup/MoltenVK
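To be concrete about what "implementation" means here: MoltenVK is a drop-in Vulkan library, so ordinary Vulkan code runs unmodified when linked against it on macOS/iOS. A minimal sketch (standard Vulkan calls only, nothing MoltenVK-specific; assumes the Vulkan SDK headers are available):

```c
#include <vulkan/vulkan.h>
#include <stdio.h>

int main(void) {
    // Standard Vulkan instance creation. With MoltenVK linked in as the
    // Vulkan library, this same code runs on macOS/iOS, with the calls
    // translated to Metal underneath.
    VkApplicationInfo app = {
        .sType = VK_STRUCTURE_TYPE_APPLICATION_INFO,
        .pApplicationName = "hello-moltenvk",
        .apiVersion = VK_API_VERSION_1_0,
    };
    VkInstanceCreateInfo info = {
        .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
        .pApplicationInfo = &app,
    };
    VkInstance instance;
    if (vkCreateInstance(&info, NULL, &instance) != VK_SUCCESS) {
        fprintf(stderr, "vkCreateInstance failed\n");
        return 1;
    }
    vkDestroyInstance(instance, NULL);
    return 0;
}
```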



Sure, but the whole point of both Vulkan and Metal is to bring out more performance by being lower-level. I'd assume that at least part of that benefit is lost when you use something like this.


A lot of the point of Vulkan is not having to rewrite your entire graphics stack for every OS you want to target.


But you have to rewrite it for every major hardware vendor, or else you won't get the performance you want.

GL should always work for the simple case, but instead you need to rewrite to avoid bugs in its many layers. And once you have Vulkan/Metal, industry-specific wrappers are better than the impossible to debug procedural GL junk.


I'm not sure I agree with the claim, but even if we take a full rewrite at face value, "every major graphics vendor" for desktop applications means NVIDIA and AMD/ATI. On mobile, you're probably using Unity or similar middleware and therefore not thinking about bare metal (no pun intended).


Actually it is more like every major GPU family, thanks to the extension fest of Vulkan.


That's the point of OpenGL; to my understanding, Vulkan is a modernized, lower-level spiritual successor to OpenGL.


Isn't that kind of the point of OpenGL?


Yes, but OpenGL is so outdated that the people who should use it the most (game and 3D application developers) were avoiding it because the API no longer fits modern hardware.

Vulkan was created to get that same portability, with an API that fits modern hardware.


That isn't remotely true. OpenGL is only outdated on macOS where Apple hasn't updated it for 8 years.

OpenGL 4.6 isn't anything like OpenGL 1.0/2.0 even though you can still _run_ those old OpenGL 1.0/2.0 tutorials.

You can even do most of the cool stuff of Vulkan in OpenGL via AZDO techniques (example: https://developer.nvidia.com/opengl-vulkan )
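One of the AZDO ("Approaching Zero Driver Overhead") techniques, as a sketch: persistently mapped buffers via GL 4.4's ARB_buffer_storage. The function name and `size` parameter are illustrative; it assumes a current GL context with GLEW initialized:

```c
#include <GL/glew.h>

// Sketch: create an immutable buffer that stays mapped for its whole
// lifetime, so per-frame vertex data can be written straight into the
// returned pointer without repeated glMapBuffer/glUnmapBuffer calls.
void *make_persistent_buffer(GLsizeiptr size, GLuint *out_buf) {
    GLbitfield flags = GL_MAP_WRITE_BIT | GL_MAP_PERSISTENT_BIT
                     | GL_MAP_COHERENT_BIT;
    glGenBuffers(1, out_buf);
    glBindBuffer(GL_ARRAY_BUFFER, *out_buf);
    // Immutable storage that may remain mapped while the GPU reads it.
    glBufferStorage(GL_ARRAY_BUFFER, size, NULL, flags);
    // Map once; write into this pointer every frame, no unmap/remap.
    return glMapBufferRange(GL_ARRAY_BUFFER, 0, size, flags);
}
```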


> OpenGL is only outdated on macOS

Also on Windows.

Apparently, if you want to distribute your software to a wide audience, you can only rely on OpenGL 3.0 with a minimal set of extensions. Here’s an example: https://github.com/Const-me/GL3Windows#building-and-running

All the target systems had the latest Windows updates, and they all run Direct3D 11 software just fine (I mostly develop for D3D and test on them). On some systems it works in 10.1 compatibility mode; MS calls these “feature levels”. Not a big deal in practice; the majority of D3D11 stuff still works OK.


No, you're getting stuck at 3.0 because you're hitting the deprecation strategy. You need to specifically request a post-3.0 context with wglCreateContextAttribsARB, which you're not doing. Thus the system thinks you're an old legacy OpenGL app and gives you 3.0, as that was the last version before things were removed.

See: https://www.khronos.org/opengl/wiki/Tutorial:_OpenGL_3.1_The... for a tutorial.

My Pascal-based Nvidia GPU is showing OpenGL 4.6 on Windows 10. Nothing outdated here.
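For reference, the usual dance looks like this (Windows-only sketch, error handling trimmed; the function name `create_modern_context` is mine, and it assumes a legacy context is already current on `hdc`, since you need one before you can query the extension pointer):

```c
#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h>   // WGL_CONTEXT_* tokens and the ARB typedef

HGLRC create_modern_context(HDC hdc) {
    // The extension pointer is only queryable once a (legacy) GL
    // context is current.
    PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB =
        (PFNWGLCREATECONTEXTATTRIBSARBPROC)
        wglGetProcAddress("wglCreateContextAttribsARB");
    if (!wglCreateContextAttribsARB)
        return NULL;  // driver predates the extension

    // Explicitly ask for a modern core-profile context; without this,
    // you land on the legacy path described above.
    const int attribs[] = {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 4,
        WGL_CONTEXT_MINOR_VERSION_ARB, 5,
        WGL_CONTEXT_PROFILE_MASK_ARB,  WGL_CONTEXT_CORE_PROFILE_BIT_ARB,
        0
    };
    return wglCreateContextAttribsARB(hdc, NULL, attribs);
}
```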


> No, you're getting stuck at 3.0 because you're hitting the deprecation strategy.

I think you’re wrong here. Two reasons.

1. If that were the case, I would be stuck with GL 3.0 regardless of the GPU. In reality, I’m only stuck with a GL version < 4.0 on the HD 2000 and VMware. On my desktop PC (Maxwell at the time I wrote that demo), OpenGL 4.0 worked just fine in that very project. Even on an Intel HD 4000 laptop, OpenGL 4.0 worked just fine with the same code.

2. Please read Intel’s documentation: https://www.intel.com/content/www/us/en/support/articles/000... Specifically, please expand “2nd Generation Intel® Core™ Processors” section. As you see in that table, Intel says HD Graphics 3000/2000 only support OpenGL 3.1, which is exactly what I’m getting from the GLEW library I’m using in that project.

Also, you can see in that article that no Intel GPU supports the GL 4.6 mentioned by the GP. Even the latest-generation UHD Graphics 620/630 only support GL 4.5. Meanwhile, they’ve supported the latest DirectX 12 for several years already.


> 1. If that were the case, I would be stuck with GL 3.0 regardless of the GPU. In reality, I’m only stuck with a GL version < 4.0 on the HD 2000 and VMware. On my desktop PC (Maxwell at the time I wrote that demo), OpenGL 4.0 worked just fine in that very project. Even on an Intel HD 4000 laptop, OpenGL 4.0 worked just fine with the same code.

Behavior depends on whether the device supports the 3.2+ compatibility profile, which is optional.

You're hitting the legacy path, that's well-defined ( https://www.khronos.org/registry/OpenGL/extensions/ARB/WGL_A... ). You need to use the method I mentioned to get real post-3.0 OpenGL.

> Also, you can see in that article that no intel GPU supports GL 4.6 mentioned by GP. Even the latest generation UHD Graphics 620/630 only support GL 4.5. Meanwhile, they support the latest DirectX 12 for several years already.

OK, so? 4.5 isn't really outdated, either. It still supports all the modern good stuff. And, as we've established at this point, it's not Windows stopping you from leveraging the full extent of the hardware you have. By contrast, macOS does stop you from using your hardware to the fullest, as it's stuck on 4.1.


> depends on if the device supports 3.2+ compatibility mode which is optional.

For the systems I have in this house it’s not required, i.e. I’m getting the same OpenGL version that’s advertised by the GPU vendors.

> You need to use the method I mentioned to get real post-3.0 OpenGL.

Either I don’t, or the authors of that GLEW library https://www.opengl.org/sdk/libs/GLEW/ already did that. When running on modern GPUs, the code in my repository already uses real post-3.0 OpenGL just fine, including the shaders.

> ok, so? 4.5 isn't really outdated, either.

Right, but 3.1 (Intel Sandy Bridge) is. And 4.0 is outdated, too (Intel Ivy Bridge). Meanwhile, modern Direct3D works fine on these GPUs: D3D 11 at feature level 10.1, and native 11.0, respectively.


> Either I don’t, or the authors of that GLEW library https://www.opengl.org/sdk/libs/GLEW/ already did that. When running on modern GPUs, the code in my repository already uses real post-3.0 OpenGL just fine, including the shaders.

Go read the extension I linked, it explains the behavior you're seeing. Also go read the tutorial I linked, it's using GLEW and shows you how to create a context.

You have a bug if your intention is to get a post-3.0 OpenGL context. Whether or not you care is up to you. You may be perfectly happy being in the compatibility bucket. I don't know. But you're not in the explicit 3.1 or later path.

> Right, but 3.1 (Intel Sandy Bridge) is.

Sandy Bridge is a 7 year old CPU. Of course it's outdated...? And D3D 10.1 is from 2007; it's also hugely outdated. You're not getting anything more modern out of the hardware with D3D than you are with OpenGL here. I don't even know what argument you're trying to make at this point.


Just a nitpick, OpenGL 4.0 is newer than DirectX 11.0 and on par in capabilities.


No. Both ATI and Nvidia drivers include recent OpenGL versions, so OpenGL support problems are limited to hardware that genuinely isn't capable.

In the old link you offer as example, Intel HD3000 and HD4000 are bad, with bad drivers that lie about OpenGL versions (hence the need to downgrade the client), and fortunately obsolete. Current Intel integrated graphics have improved. And VMware is a virtual machine, not hardware; it should be expected to be terrible.


> Intel HD3000 and HD4000 are bad, with bad drivers that lie about OpenGL versions

Technically that’s probably true. However, if you drop support for Intel GPUs, your GL4+ software will no longer run on the huge number of older Windows laptops people are still using. For many kinds of software this is a bad tradeoff. That’s exactly why all modern browsers implement WebGL on top of Direct3D, and the overwhelming majority of multi-platform games and 3D apps use D3D when running on Windows.

> VMware is a virtual machine, not hardware; it should be expected to be terrible.

It’s only terrible for OpenGL. The virtual GPU driver uses the host GPU to render, and it runs D3D11-based software just fine. I don’t use it for gaming, but it’s nice to be able to use a VM to reproduce and fix bugs in my software caused by an outdated OS, Windows localizations, and other environmental factors.


That's not why they do that at all. They don't need anything recent from OpenGL or Direct3D, which is why they target DX9. And DX9 is specifically targeted because it also works on XP, which D3D10 doesn't.

Intel GPUs' D3D drivers have historically been better than their OpenGL ones (which isn't saying much, since their D3D drivers are also trash), but now we're talking about the driver quality of one vendor, which has nothing to do with the API itself or with OpenGL somehow being outdated on Windows.

But ANGLE also targets desktop OpenGL (and Vulkan), and as OpenGL 4.3 adoption increases I'd expect more and more browsers to use it for WebGL 2.0, since you don't need translation there at all. OpenGL 4.3 provides full compatibility with OpenGL ES 3.0.

You seem to be pretty confused on how OpenGL versions line up with the D3D ones, too. For reference OpenGL 3.1 is roughly equivalent to D3D 10.1. When you're complaining about only getting GL 3.1, you're also complaining about being stuck with D3D 10.1


Yeah, but that software won't run inside Windows containers, like the store, or work with the Visual Layer Engine in W10.


You know that OpenGL is an API standard, and not a piece of software, right?


Yes? What does that have to do with anything that I've said?


It's still way faster than OpenGL in some scenarios.


Way faster in most scenarios actually. I'll give it that. (Any scenario I've tested anyway.)

But it's also more difficult for the lay programmer to use as well.


Just use a Vulkan implementation of OpenGL.

OpenGL -> Vulkan -> Metal

I can feel the performance gains already.


Why is Apple investing in Metal at all then?


Same reason Microsoft invests in DirectX: lock in software developers and consumers alike.


I can kind of understand iOS, but it’s not like there’s a thriving graphical computing market worth locking in on the mac side. All major titles already use game engines. They’d just be locking out smaller developers who can’t invest in porting all their shaders: it’s not gonna be worth the effort.

Granted, this may also be true for OpenGL.


But iOS is a lot bigger than macOS.

You notice that Apple supported OpenGL while they were making most of their money from desktop and laptop sales; but once iOS became so profitable, they decided to go their own way and start pushing Metal.

Lock in, or at least getting people to do more iOS first development (helped by lower profits on app store sales on Android, Android fragmentation, etc), helps Apple out a lot. You get the first version of apps and games, or more polished versions of apps and games, on iOS this way.


So why lock in macOS at all?


Because the developers who would be working on OpenGL on macOS are working on Metal instead, because that's where the value is for Apple.


Maybe... why would you assume they'd continue developing for Macs at all? Small studios might not have the resources, and the market is tiny for many apps, e.g. indie games, modeling software, and ML (to be fair, Apple has repeatedly emphasized it doesn't care about ML on the desktop by not offering Nvidia cards...).

And again, I don’t see the benefit for apple over supporting cross platform apis to encourage development. It seems like a net loss for everyone but some line in their budget on driver maintenance.


They do make some money on Macs, and Mac software, but not nearly as much as on iOS.

Providing macOS gives a developer and designer platform for iOS. That is really important for them. So Metal being available on macOS is important for that reason. But it's also important in that the Mac platform is still important, just not nearly as important as iOS.

OpenGL doesn't really have much of a future. Everyone is moving towards the next-generation frameworks. It just happens that there was a lot of uncertainty about whether OpenGL could adapt or whether there would be a successor, and during that time Apple decided to invest in developing Metal. It wasn't until a couple of years later that Vulkan was released.

In the meantime, Apple has built up quite a lot of tooling around Metal.

And it's not like it's that difficult to write cross platform apps that target the Mac. If you write against major engines, they will already have support for the different backends. If you are writing your own software, you can still target OpenGL, or you can target Vulkan and use MoltenVK to run it on macOS.

And for the next several years, people writing portable software are going to have to either just target OpenGL, for compatibility with older graphics cards, or maintain at least two backends, OpenGL and Vulkan. Given that lots of games target DirectX first, and consider any of the cross-platform frameworks a porting effort, Apple probably doesn't consider it a big loss to add one more platform that needs to be ported to.

What's going to wind up happening is more software like ANGLE (https://github.com/google/angle), MoltenVK (https://github.com/KhronosGroup/MoltenVK), and gfx-rs (https://github.com/gfx-rs/gfx and https://github.com/gfx-rs/portability, see http://gfx-rs.github.io/2018/04/09/vulkan-portability.html for details) for providing one set of abstractions on top of several different backends.


Wouldn’t they be better off attracting developers instead of locking them in?


They could easily jump ship if that weren't the case. If you're working on Metal, your skills aren't of much use at Microsoft.


It's tailored to their hardware, more modern, and written in Objective-C, which makes it much easier for Mac developers to integrate into their projects, since Objective-C interfaces nicely with Swift and most scripting languages.


Metal is a C++ API, not Objective-C.


Ditto for OpenGL ES (http://moltengl.com/)



