Experiencing Cosmic Rays with Blender in a Fulldome

Posted by Community Reporter

Dalai Felinto, Mike Pan and Martins Upitis use Blender in a real-time setup with sub-atomic particle detectors and a fulldome projector. You can see their work at the Cosmic Sensation event in Nijmegen, the Netherlands, from Sept 30 – Oct 2, and they’re giving away free tickets!

(By the way: it was good to meet you last night guys, I hope to see you again at the BConf 2010!)

Dalai Felinto writes:

Hello BlenderNation,

We are here to present the Blender Game Engine project we are currently working on and to invite interested artists to see it live at the Cosmic Sensation event, which runs from September 30th to October 2nd in Nijmegen, the Netherlands. For more information please visit www.cosmicsensation.nl. The tickets are quite affordable; nevertheless, we will be giving away 27 tickets to the first Blender heads who contact us at [email protected]

Our team is an international joint effort between artists, technical artists and coders. The project started one year ago with the architect Dalai Felinto (dfelinto – Brazil). Over that time a lot of effort went into making sure Blender 2.5 had a Blender Game Engine up and running; in the open-source world, that translates into a lot of committed code and reported bugs. With the basics covered, the digital artists Mike Pan (mpan3 – Canada) and Martins Upitis (martinsh – Latvia) joined the project for one intensive month of creative work.

The project is called Cosmic Sensation. It is hosted by the experimental high-energy physics department at Radboud Universiteit Nijmegen. Led by professor Sijbrand de Jong, their research discovered new ways to detect sub-atomic particles, in particular a high-energy one known as the muon. Given their characteristics, muons’ trajectories are not deflected after they separate from the proton nuclei, so being able to detect muons and trace their directions can lead to important studies on the origin and direction of the proton emissions. The original study was published in Science magazine, but the scientists wanted to reach a larger audience, so they came up with the idea of the Cosmic Sensation event.

For three nights the largest immersive fulldome in the world will be the stage for Cosmic Sensation, a blend of dancing, music and visuals in a way never seen before. Real particle sensors, installed on the 30-meter dome structure, will trigger music and visual effects whenever a new cosmic ray hits the dome. This realtime feed of events not only produces procedural music (superimposed on the DJs’ work) but also feeds an interactive digital visualization projected inside the dome. As you may have guessed, the Blender Game Engine is the technology behind that part.

We don’t want to spoil the surprise, but the Blender visualization is an artistic interpretation of the particles’ movement along the dome, with fancy visual effects. We are using the whole dome as a canvas for digital projection, receiving the realtime data from the sensors (through OSC) and using the fulldome/fisheye mode (slightly patched) to project with the correct stitching.
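To give an idea of how the OSC side of such a setup can work, here is a minimal sketch of a receiver that parses simple OSC messages from a non-blocking UDP socket, so the game loop can drain pending sensor events once per frame. The address pattern `/hit`, the port number and the argument layout are assumptions for illustration, not the actual protocol used in the installation.

```python
import socket
import struct

def parse_osc(data):
    """Parse a basic OSC message: address, type tags, then int32/float32 args."""
    def read_string(buf, pos):
        end = buf.index(b"\x00", pos)
        text = buf[pos:end].decode("ascii")
        # OSC strings are NUL-padded to the next 4-byte boundary
        return text, (end + 4) & ~3

    address, pos = read_string(data, 0)
    tags, pos = read_string(data, pos)
    args = []
    for tag in tags.lstrip(","):
        if tag == "i":                        # 32-bit big-endian integer
            args.append(struct.unpack(">i", data[pos:pos + 4])[0])
            pos += 4
        elif tag == "f":                      # 32-bit big-endian float
            args.append(struct.unpack(">f", data[pos:pos + 4])[0])
            pos += 4
    return address, args

def poll_sensor_events(sock):
    """Drain all pending OSC packets without blocking the render loop."""
    events = []
    while True:
        try:
            packet, _sender = sock.recvfrom(4096)
        except BlockingIOError:
            break
        events.append(parse_osc(packet))
    return events

# Typical setup: a non-blocking UDP socket polled once per game-engine frame.
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.bind(("0.0.0.0", 9000))
# sock.setblocking(False)
```

A full OSC implementation also handles bundles, strings and blobs; this sketch keeps only the integer and float arguments a particle-hit message would typically carry.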

For those unable to attend the main event but interested in hearing more about this project, we will be talking about it at the Blender Conference 2010. In November, after the presentation, we will release the files under a CC license, along with photos and footage from the project. Stay tuned!

Related Links:

  • For more information and tickets (€5 to €7.5 full entrance, half for students) - www.cosmicsensation.nl
  • “Blender and Immersive Gaming in an Hemispherical Dome” – Proceedings of the Computer Games & Allied Technology 10 (CGAT10), Research Publishing Services.
  • “Blender Games in the iDome” – BlenderNation online article.
  • “Immersive Domes and Blender Game Engine” at the Blender Conference 2009 – Proceedings – Video
  • www.cosmicsensation.nl/blender – Hot Site with the downloadable files, pictures, videos and tutorials. To be released in November.

* News about the project and more inside information on the team’s blog.

This entry was posted in Animation Festivals, Games, Video by Community Reporter. Bookmark the permalink.

    10 thoughts on “Experiencing Cosmic Rays with Blender in a Fulldome”

    1. Vending said:

      First! -and a ++ for the dome projection :D

      Log in to Reply
    2. dwayne said:

      Awesome! Since I’m a physics major right now, with a tour of the lab tonight, this is perfect timing! Wish I was in Europe.

      Log in to Reply
    3. faxrender said:

      Great!
      Unfortunately there is still no dome projection rendering in Blender (fisheye camera), nor in Luxrender…

      Log in to Reply
    4. Dalai said:

      Oh I noticed a typo (my fault) the email is actually [email protected]
      Could you change that Bart?

      And thanks for posting the news and it was indeed great to see you there. We are meeting up soon for sure :)

      Log in to Reply
    5. Ron Proctor said:

      @Dalai: Very cool. Congratulations and best of luck!

      Also: what is the effective resolution of the display? Assuming it’s multi-projector — how do you get Blender to split the live video output for the individual projectors?

      @faxrender: While fisheye is not natively supported, you could use our fisheye camera rig. We’ve produced a number of planetarium shows this way. (Works in 2.49 and 2.5)

      Here’s the current link:

      www.planetarium.net/group/blendheads/forum/attachment/download?id=2096069%3AUploadedFile%3A13314

      Log in to Reply
    6. Pawel said:

      Quite incredible!

      Log in to Reply
    7. Coby Randal said:

      I am definitely interested in seeing how they control Blender with musical instruments. Are they using MIDI with the game engine?! That would be fantastically cool, giving game designers an extra arsenal of input possibilities through MIDI controllers.

      Just imagine: you’re bug-testing and polishing a game, tweaking property values while the game is running (if that’s possible) by placing the mouse cursor over an enemy spawn-point or something, and then turning a knob or raising a MIDI controller slider to adjust the property levels. It could be incredibly useful.

      From a purely musical Visual DJing standpoint, there are tons of options. With so many different kinds of input available through midi devices, that could trigger any number of customized effects and light shows, all running in real-time on Blender’s game engine.

      Just a few input possibilities:
      Velocity (of a keyboard key being pressed)
      Note
      Modulation Wheel
      Pitch Bend Wheel
      Aftertouch
      Sustain Pedal
      Sliders and knobs
      Switches
      Velocity Sensitive rubber drum buttons

      Log in to Reply
    8. Dalai said:

      @Ron: we are using one output split across two 1920×1200 projectors (so we have an effective 3840×1200 output). This gets projected over a 360º×40º fov (field of view). So in the end we have approximately a 1/2-inch pixel. It’s not so bad; it looks good in the dome.

      Blender is connected to two stitching machines responsible for controlling all the inputs (videos, the Blender live stream and the other VJs’ live work) and for handling the blending, color balance, etc. They are using commercial software that wasn’t originally made for domes, but for live events. I believe they will officially support domes in their next versions (so far a lot of experimentation is going on here).

      There were some challenges in the stitching on the Blender side. As I write this I’m implementing one extra “dome” mode specifically for this dome, in order to give us better resolution with optimized performance.

      @Coby: the sensors send MIDI events to a middleware that converts MIDI into OSC signals, which are broadcast to the different controllers (audio, visuals, Blender, lights, …). Blender listens to the OSC and is indeed directly controlled by those events. At the last Blender Conference I saw at least two projects doing exactly this OSC/BGE integration, so it’s quite straightforward. You can find the videos and proceedings online.
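As a rough illustration of the kind of translation such a middleware performs, here is a minimal sketch that maps a raw MIDI event to an OSC-style (address, args) pair. The status-byte decoding follows the MIDI convention (event kind in the high nibble, channel in the low), but the OSC address names `/sensor/hit` and `/sensor/cc` are made up for the example, not the addresses used in the installation.

```python
def midi_to_osc(status, data1, data2):
    """Translate one raw MIDI event into an (address, args) pair for OSC.

    status: MIDI status byte (event kind in the high nibble, channel in the low)
    data1/data2: the two MIDI data bytes (note/controller number, velocity/value)
    """
    kind = status & 0xF0
    channel = status & 0x0F
    if kind == 0x90 and data2 > 0:   # note-on with non-zero velocity
        return ("/sensor/hit", [channel, data1, data2 / 127.0])
    if kind == 0xB0:                 # control change (knobs, sliders, switches)
        return ("/sensor/cc", [channel, data1, data2 / 127.0])
    return None                      # everything else is ignored in this sketch
```

Normalizing the 0–127 MIDI value range to 0.0–1.0 keeps the downstream consumers (audio, visuals, lights) agnostic about where the event came from.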

      Log in to Reply
    9. lucio said:

      I loved the idea.

      Log in to Reply