Just a few years ago, the Immersive Reader was a bold idea in a Microsoft Hackathon focused on using the latest science and research around reading, together with inclusive design principles, to empower students of all abilities. Today, more than 16 million people every month are using it for free, improving their reading and writing comprehension and even their ability to do math problems. Today, we're thrilled to announce that this powerful literacy tool will now be available as an Azure Cognitive Service, allowing third-party apps and partners to add the Immersive Reader right into their products, starting with the partners below. The Immersive Reader helps people of all abilities, including people with dyslexia, ADHD, or visual impairments, emerging readers, and non-native speakers. We've made it easier than ever; you might even call it Literacy as a Service!

Below is the first wave of partners that will be enabling the Immersive Reader in their apps or services to help make content more accessible and inclusive. If there's an app you love to use in your classroom and you would love to see an Immersive Reader icon show up there soon, let them know.

We've been working with these partners during our private preview over the past few months. These partners have enabled the Immersive Reader for students and schools who have been doing early testing of the integration, and we have some early feedback from both teachers and students.

Shannon McClintock Miller, K-12 district teacher librarian at Van Meter Community School in Van Meter, Iowa: "Wow! I am so excited about finding the Immersive Reader from Microsoft in Buncee now. This will bring Buncees to life for our readers and learners, as they now can listen to the digital stories and projects they create and write. This is truly a game changer for Buncee!"

I hope this saves future Googlers some pain.
To mirror the headset view to a spectator canvas:

1. When you initialize your glContext, be sure to specify that antialiasing is false. This is important if your spectator and HMD views are to be different resolutions.
2. Size the spectator canvas however you like (mine was fluid, so it had a CSS width of 100% of its parent container with a height of auto).
3. Create a framebuffer that will be used to store your rendered HMD view. I create it once so I can just re-use it at will: let spectateBuffer = _glContext.createFramebuffer()
4. Draw your immersive-xr layer as usual in your xrSession.requestAnimationFrame(OnXRFrame) callback.
5. Just prior to exiting your OnXRFrame method, implement a call to draw the spectator view. I personally used a bool showCanvas to allow me to toggle the spectator mirror on and off as desired.

The spectator draw itself (a tad more verbose than needed, to illustrate what's going on; you don't need to declare the src and dest x/y's as their own variables):

- Bind my spectate framebuffer to the WebGL2 read buffer: _glContext.bindFramebuffer(GL.READ_FRAMEBUFFER, spectateBuffer). (GL is a quick reference I like to use for the WebGL enums and types.)
- Set the DRAW_FRAMEBUFFER to null; this tells the renderer to draw to the canvas: _glContext.bindFramebuffer(GL.DRAW_FRAMEBUFFER, null)
- Store the last HMD canvas view size (mine was 0.89:1 aspect, 2296x2552) and set the canvas view size for the spectator view (mine was 2:1 aspect, 1280x640).
- Define the bounds of the source buffer you want to use: let srcY0 = bufferHeight * 0.25 // I crop off the bottom 25% of the HMD's view, and let srcY1 = bufferHeight - (bufferHeight * 0.25) // I crop off the top 25% of the HMD's view
- Blit the source buffer to the output buffer.

Note: I'm only showing one of my HMD eye views as the spectator view; to show both, you would need to store a spectator framebuffer per eye and blit them together side by side.
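Assembled in one place, the steps above might look like the following sketch. It assumes a WebGL2 rendering context (`_glContext`), a `spectateBuffer` framebuffer that the HMD view was rendered into, and a spectator `canvas`; the `computeSourceBounds` and `drawSpectatorView` helpers and their parameter names are illustrative, not a drop-in implementation.

```javascript
// Sketch only. Assumes a WebGL2 rendering context (_glContext), a framebuffer
// (spectateBuffer) that the HMD view was rendered into, and a spectator
// <canvas>; all names here are illustrative.

// Pure helper: source rect that crops 25% off the top and bottom of the
// HMD buffer, as described above. Full width is kept.
function computeSourceBounds(bufferWidth, bufferHeight) {
  return {
    srcX0: 0,
    srcY0: bufferHeight * 0.25,                // crop off the bottom 25%
    srcX1: bufferWidth,
    srcY1: bufferHeight - bufferHeight * 0.25, // crop off the top 25%
  };
}

function drawSpectatorView(_glContext, spectateBuffer, hmdWidth, hmdHeight, canvas) {
  // Read from the framebuffer holding the rendered HMD view...
  _glContext.bindFramebuffer(_glContext.READ_FRAMEBUFFER, spectateBuffer);
  // ...and draw to the default framebuffer, i.e. the visible canvas.
  _glContext.bindFramebuffer(_glContext.DRAW_FRAMEBUFFER, null);

  const { srcX0, srcY0, srcX1, srcY1 } = computeSourceBounds(hmdWidth, hmdHeight);

  // Stretch the cropped HMD region over the whole spectator canvas.
  _glContext.blitFramebuffer(
    srcX0, srcY0, srcX1, srcY1,         // source rect, in HMD buffer pixels
    0, 0, canvas.width, canvas.height,  // destination rect, in canvas pixels
    _glContext.COLOR_BUFFER_BIT, _glContext.LINEAR);
}
```

Called just before OnXRFrame returns (and only when showCanvas is true), this blits one eye's view onto the canvas. Disabling antialiasing when creating the context matters here because a multisampled default framebuffer cannot be the destination of a scaling blit, and scaling is exactly what maps the 2296x2552 eye buffer onto a 1280x640 canvas.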