Hi Tom, list,
This looks like great work. I was hoping to have a look at some multiple rendering context stuff this weekend, so I'll let you know how that goes.
An open question: does SDL encapsulate enough glx/wgl/mac os features to support pbuffers and multiple rendering contexts in a platform-independent way?
Daniel
----- Original Message -----
From: "Tom Schouten" doelie@zzz.kotnet.org
To: zmoelnig@iem.at; "Daniel Heckenberg" daniel@bogusfront.org
Cc: pd-dev@iem.kug.ac.at
Sent: Thursday, March 13, 2003 9:16 PM
Subject: Re: [PD-dev] [GEM] rendering context (gem2pdp)
On Wednesday 05 March 2003 09:17, zmoelnig@iem.at wrote:
Quoting Daniel Heckenberg daniel@bogusfront.org:
How about this:
- you can name a rendering context in each gemhead and have that rendering chain render to the context (be it a window, pbuffer or whatever).
- each gemwin can also be named.
hi daniel, et al.
my plan (which might be influenced too much by other 3d-rendering software) was rather not to make completely independent rendering chains (by naming them and connecting them to a [gemwin] via the name), but to have the [gemhead] rendering chains globally connected to multiple [gemwin]s. The [gemwin]s could be controlled independently with respect to camera/viewpoint, bg-color, size, and also offscreen rendering. This is heavily influenced by the "camera" idea of other software.
but on the other hand, it is a lot of work to be done.
hi daniel, iohannes
i decided to do some more experiments with opengl stuff on top of pdp, and this seems to work rather well. it is all centered around render context packets being passed around. if you are interested you can have a look at the opengl/ folder in the pdp package. maybe gem could benefit from this, dunno..
http://zwizwa.fartit.com/pd/pdp/test/pdp-0.11-test-6.tar.gz
it requires glx 1.3 for pbuffer support, though. (the only things on linux that have this, as far as i know, are the 41.xx nvidia drivers and mesa 5.0)
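for reference, the glx 1.3 part boils down to something like this (an untested sketch with made-up names, no error checking, not the actual pdp code):

  #include <GL/glx.h>

  /* create a pbuffer plus a context that draws into it.
     passing "share" as share_list puts the new context in the same
     share group, so textures can be seen across contexts. */
  GLXPbuffer make_pbuffer(Display *dpy, int w, int h,
                          GLXContext share, GLXContext *ctx_out)
  {
      int fb_attr[] = {GLX_DRAWABLE_TYPE, GLX_PBUFFER_BIT,
                       GLX_RENDER_TYPE,   GLX_RGBA_BIT,
                       GLX_RED_SIZE, 8, GLX_GREEN_SIZE, 8, GLX_BLUE_SIZE, 8,
                       GLX_DEPTH_SIZE, 16, None};
      int pb_attr[] = {GLX_PBUFFER_WIDTH, w, GLX_PBUFFER_HEIGHT, h, None};
      int n;
      GLXFBConfig *fbc = glXChooseFBConfig(dpy, DefaultScreen(dpy), fb_attr, &n);
      GLXPbuffer pb = glXCreatePbuffer(dpy, fbc[0], pb_attr);
      *ctx_out = glXCreateNewContext(dpy, fbc[0], GLX_RGBA_TYPE, share, True);
      XFree(fbc);
      return pb;
  }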
right now all the rendering is to a pbuffer. there is a 3dp_context object that provides a context, and all 3dp_ objects draw to / manipulate this context. on output, the contents of the buffer are dumped into a texture and displayed.
i chose this approach to have an easy multiple-stage rendering chain, where a pbuf can be dumped into a texture and reused in another pbuf rendering, etc.. this also makes it possible to set the window dimensions independently from the render buffer dimensions.
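one stage of such a chain looks roughly like this in plain gl/glx calls (again just a sketch: draw_scene and draw_textured_quad are stand-ins, and the texture is assumed to be allocated already with glTexImage2D in a shared context):

  #include <GL/glx.h>

  extern void draw_scene(void);          /* stand-in for the 3dp_ objects */
  extern void draw_textured_quad(void);  /* stand-in for the next stage's geometry */

  /* render into pbuf_a, grab the result into a texture, then use that
     texture while rendering into pbuf_b */
  void render_stage(Display *dpy,
                    GLXPbuffer pbuf_a, GLXContext ctx_a,
                    GLXPbuffer pbuf_b, GLXContext ctx_b,
                    GLuint tex, int w, int h)
  {
      glXMakeContextCurrent(dpy, pbuf_a, pbuf_a, ctx_a);
      draw_scene();
      glBindTexture(GL_TEXTURE_2D, tex);
      glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, w, h);

      glXMakeContextCurrent(dpy, pbuf_b, pbuf_b, ctx_b);
      glBindTexture(GL_TEXTURE_2D, tex);
      draw_textured_quad();               /* reuse the first stage's output */
  }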
i also tried multiple camera views in two different windows, which works if you propagate 2 different contexts through a single rendering chain, and route through a different modelview transform chain in front of the main chain and to a different window after the chain. (check the patches in test/)
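in plain gl terms that amounts to something like this per frame (win/ctx/camera are made-up handles for what the context packets carry, draw_scene stands in for the shared chain):

  #include <GL/glx.h>

  extern void draw_scene(void);           /* the single shared rendering chain */

  void render_views(Display *dpy, GLXWindow win[2], GLXContext ctx[2],
                    const double camera[2][16])
  {
      int i;
      for (i = 0; i < 2; i++) {
          glXMakeContextCurrent(dpy, win[i], win[i], ctx[i]);
          glMatrixMode(GL_MODELVIEW);
          glLoadMatrixd(camera[i]);       /* different viewpoint per window */
          draw_scene();
          glXSwapBuffers(dpy, win[i]);
      }
  }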
one note: i have the impression that the rendering context switching (between different pbufs and a window, for example) is a rather expensive operation. i haven't nailed it down yet, but something is causing a lot of extra cycles..
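one way to nail it down might be to time the raw switches in isolation, e.g. with a made-up helper like this (not in the patch):

  #include <GL/glx.h>
  #include <sys/time.h>
  #include <stdio.h>

  static double now_ms(void)
  {
      struct timeval tv;
      gettimeofday(&tv, 0);
      return tv.tv_sec * 1000.0 + tv.tv_usec / 1000.0;
  }

  /* ping-pong between two context/drawable pairs and report the average cost */
  void time_switches(Display *dpy, GLXDrawable a, GLXContext ca,
                     GLXDrawable b, GLXContext cb, int n)
  {
      int i;
      double t0 = now_ms();
      for (i = 0; i < n; i++) {
          glXMakeContextCurrent(dpy, a, a, ca);
          glXMakeContextCurrent(dpy, b, b, cb);
      }
      glFinish();   /* let the driver finish before reading the clock */
      printf("%g ms per switch\n", (now_ms() - t0) / (2.0 * n));
  }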
second note: i don't think i understood context sharing very well. it seems you need to explicitly share every pbuf with every other one to get them to see each other (for copying). right now everything is shared from a single mother scratch pbuf.
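schematically, the single mother scratch pbuf approach corresponds to giving every new context the same share_list (a sketch with made-up names; dpy and fbc come from the usual fbconfig setup):

  /* one scratch context created first ... */
  GLXContext mother = glXCreateNewContext(dpy, fbc, GLX_RGBA_TYPE, NULL, True);
  /* ... and every other context shares with it, which puts them all in the
     same share group, so no pairwise bookkeeping should be needed */
  GLXContext ctx1 = glXCreateNewContext(dpy, fbc, GLX_RGBA_TYPE, mother, True);
  GLXContext ctx2 = glXCreateNewContext(dpy, fbc, GLX_RGBA_TYPE, mother, True);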
kind regards, tom