Issues with Render to Texture #1366
-
Hello, I'm having trouble getting a render-to-texture scene to render. So far I have an osgViewer with a default camera. If I attach my root node with the glsl_simple.osgt scene to the osgViewer, everything renders just fine. I then tried to add a render-to-texture camera to the root node, attach the scene to be rendered to it, and add a canvas geometry to the root node. This is the code I use to generate the camera:
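(The original snippet didn't survive the transcription; below is a hypothetical sketch of a typical pre-render RTT camera setup of the kind described, with placeholder names like `createRttCamera` and texture size 1024×1024 that are my assumptions, not from the post.)

```cpp
#include <osg/Camera>
#include <osg/Texture2D>

// Hypothetical RTT camera: renders "scene" into "rttTexture" before the
// main camera draws the frame. Names and sizes are placeholders.
osg::ref_ptr<osg::Camera> createRttCamera(osg::Texture2D* rttTexture, osg::Node* scene)
{
    osg::ref_ptr<osg::Camera> cam = new osg::Camera;
    cam->setClearColor(osg::Vec4(1.0f, 0.0f, 0.0f, 1.0f));    // red clear colour, as described
    cam->setClearMask(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    cam->setViewport(0, 0, 1024, 1024);                       // must match the texture size
    cam->setRenderOrder(osg::Camera::PRE_RENDER);             // render before the main camera
    cam->setRenderTargetImplementation(osg::Camera::FRAME_BUFFER_OBJECT);
    cam->setReferenceFrame(osg::Camera::ABSOLUTE_RF);         // use its own view/projection
    cam->attach(osg::Camera::COLOR_BUFFER, rttTexture);       // render colour into the texture
    cam->addChild(scene);                                     // the subgraph to capture
    return cam;
}
```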
This is the code I use to create the canvas geometry:
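(Again the original snippet is missing; here is a hypothetical sketch of a canvas quad textured with the RTT result. `osg::createTexturedQuadGeometry` is a real osg helper; the function name and quad dimensions are my placeholders.)

```cpp
#include <osg/Geometry>
#include <osg/Geode>
#include <osg/Texture2D>

// Hypothetical canvas: a textured quad that displays the RTT texture.
osg::ref_ptr<osg::Geode> createCanvas(osg::Texture2D* rttTexture)
{
    osg::ref_ptr<osg::Geometry> quad = osg::createTexturedQuadGeometry(
        osg::Vec3(-1.0f, 0.0f, -1.0f),   // bottom-left corner
        osg::Vec3( 2.0f, 0.0f,  0.0f),   // width vector
        osg::Vec3( 0.0f, 0.0f,  2.0f));  // height vector
    quad->getOrCreateStateSet()->setTextureAttributeAndModes(
        0, rttTexture, osg::StateAttribute::ON);
    osg::ref_ptr<osg::Geode> geode = new osg::Geode;
    geode->addDrawable(quad);
    return geode;
}
```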
And I use this function to set the entire scene up:
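(The setup function is also missing from the transcription; this is a hypothetical sketch assuming helpers named `createRttCamera` and `createCanvas`, which are placeholder names of mine.)

```cpp
#include <osg/Group>
#include <osg/Texture2D>
#include <osgDB/ReadFile>

// Hypothetical scene setup: one shared texture, written by the RTT camera
// and read by the canvas geometry; both hang under the same root group.
osg::ref_ptr<osg::Group> setupScene()
{
    osg::ref_ptr<osg::Group> root = new osg::Group;

    osg::ref_ptr<osg::Node> scene = osgDB::readNodeFile("glsl_simple.osgt");

    osg::ref_ptr<osg::Texture2D> tex = new osg::Texture2D;
    tex->setTextureSize(1024, 1024);          // must match the RTT camera viewport
    tex->setInternalFormat(GL_RGBA);
    tex->setFilter(osg::Texture2D::MIN_FILTER, osg::Texture2D::LINEAR);
    tex->setFilter(osg::Texture2D::MAG_FILTER, osg::Texture2D::LINEAR);

    root->addChild(createRttCamera(tex.get(), scene.get()));  // writes the texture
    root->addChild(createCanvas(tex.get()));                  // displays the texture
    return root;
}
```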
And eventually, I configure the osgViewer like this in the constructor of the app:
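(The viewer snippet is missing too; a minimal hypothetical reconstruction, assuming the `setupScene` placeholder name from above, would look roughly like this.)

```cpp
#include <osgViewer/Viewer>

// Hypothetical viewer configuration: attach the root group (RTT camera
// plus canvas) as the scene data of a default osgViewer.
osgViewer::Viewer viewer;
viewer.setSceneData(setupScene());
viewer.realize();
```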
With this I can actually see the canvas, but it is coloured plain red (the clear colour of the rtt-camera); the animated scene isn't visible on the canvas geometry. What did I do wrong? I tried to stick to the osgprerender and osgdistortion examples, but unfortunately they are rather dense for "minimal" examples. I also suspect I haven't understood how cameras in OSG work, because I first tried to set up the rtt-camera as a normal camera that should display the canvas (in plain white). So I just set the camera up like this:
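(This snippet is also lost; a hypothetical reconstruction of the second attempt described above, i.e. the camera used as an ordinary in-graph node rather than a pre-render pass:)

```cpp
#include <osg/Camera>

// Hypothetical "normal" camera attempt: default RELATIVE_RF reference
// frame, nested render order, no render target attached.
osg::ref_ptr<osg::Camera> cam = new osg::Camera;
cam->setClearColor(osg::Vec4(1.0f, 0.0f, 0.0f, 1.0f));   // red clear colour
cam->setClearMask(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
cam->setReferenceFrame(osg::Camera::RELATIVE_RF);        // inherit parent camera's matrices
cam->setRenderOrder(osg::Camera::NESTED_RENDER);         // draw within the main traversal
```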
and added the rttCanvas as a child. All I could see was the red clear colour, but no white geometry. How does OSG behave when cameras are present in a subgraph? Could you please give me a hint? Thank you
Replies: 1 comment
-
Damn it... I've got the render to texture working by changing

```cpp
cam->setReferenceFrame(osg::Camera::RELATIVE_RF);
```

to

```cpp
cam->setReferenceFrame(osg::Camera::ABSOLUTE_RF);
```
But I'd still be interested if my mental model is wrong or not regarding "chained cameras" in a graph.
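For anyone landing here later, my current understanding of why this fix works (corrections welcome): with `RELATIVE_RF`, which is the default, a nested `osg::Camera` composes its view and projection with those inherited from the camera above it in the graph, so it follows the viewer's main camera; with `ABSOLUTE_RF` the camera's own matrices are used as-is, which is what a pre-render RTT pass normally wants. A sketch, with made-up eye/centre values:

```cpp
#include <osg/Camera>

// ABSOLUTE_RF: the RTT camera's own matrices define the pre-render view,
// independent of wherever the viewer's main camera happens to be looking.
osg::ref_ptr<osg::Camera> rttCam = new osg::Camera;
rttCam->setReferenceFrame(osg::Camera::ABSOLUTE_RF);
rttCam->setProjectionMatrixAsPerspective(30.0, 1.0, 0.1, 100.0);   // fovy, aspect, near, far
rttCam->setViewMatrixAsLookAt(
    osg::Vec3(0.0, -10.0, 0.0),   // eye (placeholder values)
    osg::Vec3(0.0,   0.0, 0.0),   // centre
    osg::Vec3(0.0,   0.0, 1.0));  // up
```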
Thank you