Thread with 20 posts


this is of course about touchHLE, which really needs the ability to share framebuffers between OpenGL contexts in some form, but which I've never been able to get working until now, so I've had to resort to incredibly horrible workarounds that'll only get worse in future

  1. first frame, drawn with first context, which creates the “GL” texture
  2. second frame, drawn with the second context, which uses the same texture
  3. third frame, drawn with the first context, same texture again
  4. fourth frame, drawn with the second context, same texture again
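The alternation described above relies on the two contexts being in one share group, so that a texture name created in the first context is visible in the second. A minimal sketch of that setup under EGL (assuming `dpy` and `cfg` are an already-initialized display and config):

```c
// Hedged sketch: create two EGL contexts that share objects, so the "GL"
// texture drawn in the first context can be sampled in the second.
// `dpy` and `cfg` are assumed to be set up by the caller.
#include <EGL/egl.h>

void create_shared_contexts(EGLDisplay dpy, EGLConfig cfg,
                            EGLContext *first, EGLContext *second)
{
    static const EGLint attribs[] = { EGL_CONTEXT_CLIENT_VERSION, 2, EGL_NONE };
    *first = eglCreateContext(dpy, cfg, EGL_NO_CONTEXT, attribs);
    // Passing *first as the share_context argument puts both contexts in
    // the same share group: texture names created in either context are
    // usable in the other (subject to synchronization).
    *second = eglCreateContext(dpy, cfg, *first, attribs);
}
```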

I mean, maybe using glReadPixels() is fine. These are ancient games, surely they can finish rendering in well under 16ms…? I'm not doing major rendering after that, just compositing. OTOH I shudder at the thought of sending a 4K framebuffer from VRAM to system RAM and back…
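The glReadPixels() fallback being weighed here amounts to a round trip through the CPU: read the finished frame into system RAM, switch contexts, and re-upload it as a texture. A hedged sketch (the function name, texture, and dimensions are illustrative, not touchHLE's actual code):

```c
// Sketch of the CPU round-trip fallback: glReadPixels out of the rendering
// context, then glTexImage2D back in the compositing context. At 4K RGBA
// this moves roughly 33 MB each way per frame, hence the hesitation above.
#include <stdlib.h>
#include <GLES2/gl2.h>

void copy_framebuffer_via_cpu(GLuint dst_texture, GLsizei width, GLsizei height)
{
    GLubyte *pixels = malloc((size_t)width * (size_t)height * 4);

    // In the rendering context, with the source framebuffer bound:
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    // ...after making the compositing context current, upload the copy:
    glBindTexture(GL_TEXTURE_2D, dst_texture);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    free(pixels);
}
```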

Adrián Pérez, @aperezdc@mastodon.social

@hikari IIUC you are trying to use the result from rendering in one context in another one, right? And this is all in the same process. Does the process doing the rendering need a window surface, or does it do it somehow offscreen? I think you may be able to get away with taking care of buffer allocation yourself and using that buffer as the target texture for an FBO, and the same buffer should be usable in the other context as well. The main thing to look for is glEGLImageTargetTexture2DOES
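The glEGLImageTargetTexture2DOES route mentioned above could look roughly like this: wrap the producing context's texture in an EGLImage, then bind that image as a texture in the consuming context. This sketch assumes the EGL_KHR_image_base and EGL_KHR_gl_texture_2D_image extensions on the EGL side and GL_OES_EGL_image on the GL side; `dpy`, `render_ctx`, and the texture names are illustrative:

```c
// Hedged sketch: share a texture between contexts via an EGLImage.
#include <stdint.h>
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <GLES2/gl2.h>
#include <GLES2/gl2ext.h>

// In or alongside the rendering context: wrap its texture in an EGLImage.
EGLImageKHR wrap_texture(EGLDisplay dpy, EGLContext render_ctx, GLuint src_texture)
{
    PFNEGLCREATEIMAGEKHRPROC eglCreateImageKHR =
        (PFNEGLCREATEIMAGEKHRPROC)eglGetProcAddress("eglCreateImageKHR");
    return eglCreateImageKHR(dpy, render_ctx, EGL_GL_TEXTURE_2D_KHR,
                             (EGLClientBuffer)(uintptr_t)src_texture, NULL);
}

// In the other context: make a texture whose storage is the EGLImage.
void bind_image_as_texture(EGLImageKHR image, GLuint dst_texture)
{
    PFNGLEGLIMAGETARGETTEXTURE2DOESPROC glEGLImageTargetTexture2DOES =
        (PFNGLEGLIMAGETARGETTEXTURE2DOESPROC)
            eglGetProcAddress("glEGLImageTargetTexture2DOES");
    glBindTexture(GL_TEXTURE_2D, dst_texture);
    glEGLImageTargetTexture2DOES(GL_TEXTURE_2D, (GLeglImageOES)image);
}
```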

Adrián Pérez, @aperezdc@mastodon.social

@hikari the caveats are:

- Either you glFinish on the context that paints before using the buffer as a texture elsewhere for rendering (potentially slow), or arrange syncing using fence objects (potentially gnarly).
- What can be used as a buffer to bind as the render target for the FBO depends on the EGL platform: GBM buffers on Linux for most drivers, AHardwareBuffer on Android, IOSurface on Apple (I think, don't take my word for it), and so on.
- You probably need two (front/back) buffers.
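The fence-object option from the first caveat could be sketched with OpenGL ES 3.0 sync objects, which are shared across contexts in the same share group (function names here are illustrative):

```c
// Hedged sketch of fence-based syncing between a producing and a
// consuming context, using core GLES 3.0 sync objects.
#include <GLES3/gl3.h>

// Producer: call after the draw calls that fill the shared buffer.
GLsync signal_frame_done(void)
{
    GLsync fence = glFenceSync(GL_SYNC_GPU_COMMANDS_COMPLETE, 0);
    glFlush();  // ensure the fence command actually reaches the GPU
    return fence;
}

// Consumer: call in the other context before sampling the shared buffer.
void wait_for_frame(GLsync fence)
{
    // GPU-side wait: the CPU is not blocked, unlike with glFinish.
    glWaitSync(fence, 0, GL_TIMEOUT_IGNORED);
    glDeleteSync(fence);
}
```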


@aperezdc

IIUC you are trying to use the result from rendering in one context in another one, right? And this is all in the same process.

Yes, that's right.

Does the process doing the rendering need a window surface, or does it do it somehow offscreen?

The first rendering is offscreen, resulting in an OpenGL renderbuffer. Then I want to use that renderbuffer's content in another context so I can composite it with other stuff and output it to the window.
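For reference, the offscreen first pass described here typically means attaching a renderbuffer to a framebuffer object and drawing into that. A hedged sketch (the size is illustrative, and GL_RGBA8 as a renderbuffer format needs GLES 3.0 or the OES_rgb8_rgba8 extension):

```c
// Sketch of offscreen rendering into a renderbuffer via an FBO, matching
// the first step described above.
#include <GLES2/gl2.h>

void setup_offscreen_target(GLuint *fbo, GLuint *rbo,
                            GLsizei width, GLsizei height)
{
    glGenFramebuffers(1, fbo);
    glGenRenderbuffers(1, rbo);

    glBindRenderbuffer(GL_RENDERBUFFER, *rbo);
    // GL_RGBA8 requires GLES 3.0 or OES_rgb8_rgba8 on GLES 2.0.
    glRenderbufferStorage(GL_RENDERBUFFER, GL_RGBA8, width, height);

    glBindFramebuffer(GL_FRAMEBUFFER, *fbo);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                              GL_RENDERBUFFER, *rbo);
    // Subsequent draw calls land in the renderbuffer; its contents are
    // what then needs to be shared with the compositing context.
}
```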

Thanks for the suggestions related to EGL, that seems like a workable solution.

Adrián Pérez, @aperezdc@mastodon.social

@hikari you're welcome, and at any rate I hope you manage to get things working in a way that feels satisfactory to you, regardless of whether my suggestion ends up being useful or not. While I haven't been much into the iOS ecosystem, I think it's great that there are folks out there poking at it in ways that would enable software preservation—happy hacking!
