Week 13 – After

Welcome to Jersey, a beautiful land… park? Well, it was beautiful, although on my first encounter it was rainy, cloudy and cold. Especially the next day, when it turned freezing cold and windy and constructing everything took long. So long. So did the deconstruction, but the construction was much more of an effort than expected.

Fingernails busted, totally unprepared, late nights at the school to finish wrapping up semester loose ends. All of it eventually leading up to a very windy and cold day on which we had quite some fun constructing the dome once again. Think about it: if the wind blows and you’re constructing a dome that is lightweight, made of wood and of material that just catches wind, then it’s a bit of a problem, isn’t it? With us repositioning the dome and trying to keep it steady in so many ways, with sandbags, bags and other makeshift material, it all came down to a satisfactory finish as it was raised and ready to go. I thought it would be a bit warmer within the dome, but the wind was so strong that I really got stone cold and tired, to the point where I just took a seat and… fell asleep for a while.

In any case, we eventually managed to set up the projector and calibrate everything to work properly. It took longer than expected to get it all up and running, and so by noon we decided to get it going, as the projection spanned most of the Dome and the calibration seemed to have reached its limits, especially since incoming exterior light remained an issue we had to combat in a variety of ways.

Oh, I also forgot to mention: we had bagels and coffee. Although I had donuts on my way to the spot.


Once it was all done, I figured my watch had ended, and so I decided to check on other tents and pass the torch to the other group members handling the operation. While it could feel a bit strange to do it this way, I simply could not remain in the same spot, as I was reaaally cold. So I visited a few other tents, managed to eat some protein bars and pretty good tacos, and then I realised that our dome was generating traffic. And that was a good sign. One that would remain as such up until the closing hours around 3, when our team kept saying that we were wrapping up, but 10 more sessions took place as people just wanted to see it. Considering the mapping ran for about 3 minutes and the tent would cycle about 10 people in and out, I would say about 500 people might have come across the projection mapping.

At a later stage I would inspect the dome, and it looked relatively good; the problem was the projection being skewed, but the main components worked and left people in awe. I guess that’s the most satisfying part of all the work done in the past weeks.

Week 12 – Before

This happy face indicates – gotta use a VPN, for some reason!

So, the beginning of this week marked the 5-minute presentations. These were quite indicative but, more or less, people went through what they went through. I kinda hesitated. And in fact, I took an around 4-day break. Now that I think about it, it’s kinda funny looking at the Samsung Health app with 19.5h of sleep recorded and then another 13 hours the next day. Probably not healthy, but finally getting some sleep ought to be. Now that I think about it, it’s a good time to reflect on how much work was actually involved in the last weeks. My RescueTime ain’t rescuing anyone or anything, but it’s pretty good at keeping track of what I do digitally and whether I’m wasting my time or actually (at least) trying to put it into a productive mode.

And so, while it is quite challenging to single out Ed Tech on its own and count how many hours were put towards it, I’d still say a decent ~25-35 hours weekly, with some turbulence adding or subtracting, significantly at times. And then to top it off with ~2 hours of travelling back and forth to the school: by taking the bike I’d be faster and healthier; by taking public transit I’d be spending a good over 2 hours and have less on my kangaroo bank account. But sometimes it’d be cold or rainy and the choice would all of a sudden become narrower. In any case, it is really quite a nice chunk of time and effort, and the school does tend to take all of it, once I look back.

Playtech Week

Can’t say all these hours were purely productive, but as an indicative form factor it works, and it’s nice to see that the hardware I decided to invest in earlier remains surprisingly stable and reliable, pulling off anything I really required from it (here goes a hail to Microsoft). Other than that, why am I mentioning all the work that happened outside of the class? Because many of the skills I learned outside I managed to apply here. Setting aside Rhino, which I learned 4 years back and got a refresher on here at Parsons, some of the interaction methods were based on coding that I either learned or taught myself in order to apply it in this project. Many of the experimentation methods, motion capture. Technically, with each week I see I could potentially undertake much more, but there are only as many hours as we get to work on, well, work.

So, moving forward: what do we got, what do we do, what have I done? Well, I put together a presentation into which I threw a few cool toys, some of the processes that I felt were worth mentioning and that perhaps were not broached earlier. While the process of transforming sketches into 3D models was perhaps a bit more of a refresher, it seems the more subtle part of adding microplastics was not quite seen or understood, as simple as the thing was. In the end, that made the audience a bit interested. I went through some of the difficulties and what my main focus was. I didn’t feel like giving the presentation, as there was more that I wanted to throw in, but there’s only as much as we can do, considering the time we all get.

Moving ahead, well, there’s actually a list I’ve written down:

– Cover larger swaths of terrain with grass

– Make the bottles spawn in a delayed manner after the narration stops, have the spawning stop before the experience ends, change their colour to blue and place them further away so they descend over the horizon (a rough sketch of this follows the list)

– Ongoing calibration in Unity/Madmapper
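
For the bottle item above, a rough Unity sketch of what I have in mind (the component and field names are made up here, and the timings are placeholders, not the project’s actual values):

```csharp
using System.Collections;
using UnityEngine;

// Hypothetical sketch: spawn blue bottles only after the narration finishes,
// far out so they drift down over the horizon, and stop well before the end.
public class DelayedBottleSpawner : MonoBehaviour
{
    public AudioSource narration;         // assumes the narration is playing at scene start
    public GameObject bottlePrefab;
    public float startDelay = 5f;         // extra delay after the narration stops
    public float spawnInterval = 1f;
    public float experienceLength = 180f; // the ~3-minute mapping
    public float stopBeforeEnd = 30f;     // stop spawning this long before the end

    IEnumerator Start()
    {
        // Wait until the narration has finished, then a little longer.
        yield return new WaitWhile(() => narration.isPlaying);
        yield return new WaitForSeconds(startDelay);

        float stopTime = experienceLength - stopBeforeEnd;
        while (Time.timeSinceLevelLoad < stopTime)
        {
            // Place bottles on a far ring, above the horizon line.
            Vector2 dir = Random.insideUnitCircle.normalized;
            Vector3 pos = new Vector3(dir.x, 0f, dir.y) * 60f + Vector3.up * 25f;
            GameObject bottle = Instantiate(bottlePrefab, pos, Random.rotation);

            // Tint it blue (assumes a Standard-shader material on the prefab).
            var rend = bottle.GetComponent<Renderer>();
            if (rend != null) rend.material.color = Color.blue;

            yield return new WaitForSeconds(spawnInterval);
        }
    }
}
```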

On Thursday it’s packing day, on Saturday the showcase, and we’re apparently in charge of building da thin’. As in the dome, and setting everything up in just a few hours. I hope the weather will turn nice? In any case, there will be a little more work to be done within Unity, perhaps a few updates, but at this point it’s rather just tweaks: agreeing on how much the projection mapping should cover, carrying out final tests and voilà, everything should be in place. Contingency is in place as well; I just don’t have a Parsons t-shirt (gotta get one).

In any case, I wrote this blog in advance, as I just know how much work is coming and how much I should be doing, and it’s very likely delays will happen (now hopefully not anymore 🙂)

Dome Deconstruction Timelapse

Week 11 – Post-Play

So, Playtech took place and I did not appear, as I had another thing due, taking place the entire day at the other end of the river. Not that end, and perhaps not a river but a bay, but it’s New York, so who knows what it is? A peninsula? An atoll? As ridiculous as it may sound, Google Maps tells me it’s part of Long Island Sound. In any case, I had to hack-a-thon, for an entirely different course, from 8 till 8 at a navy yard, although before heading out I obviously prepared a notch and a thon: just a simple contingency plan in case anything was to fail, an email with a set of failsafes to make sure everything would run smoothly even if it were not.

After hacking-a-thon, at some point I noticed that, well, no incoming calls were coming in, no messages really, meanwhile we were just working, and so I assumed everything should have gone well. During the presentations I sent a message and received one back saying that everything did indeed go well. Later on I’d receive a warm Canvas email with a few details. Still later I learned that the narration had seemed perhaps a bit detached, and perhaps the calibration of the projector too, but overall the Dome projection was received positively.

It would be cool to draw stuff by just tapping the dome and picking a pen colour, right?

So perhaps I’ll step a little bit backward and mention a few details that I left out, due to the onslaught of details that there were. Since we simplified the interaction and focused on immersion and the correct response of the audience, many of the interaction methods that worked on Google Cardboard naturally could not carry over to a Dome. Some things had to go, like the plastic bottles being inside fish, as in the X-ray mode that one of the team members mentioned at some point. Getting that interaction right was tricky, but it was nice to see it in action: upon X-raying a fish, the bottle would be gone and a score would be assigned. But that’s part of the production phase, especially at an art school, and especially in my case, where exploration of capabilities and facilities takes priority. Sometimes things work and stay, like the dancing man releasing microplastics, and sometimes they don’t.
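
For the record, the X-ray interaction was roughly along these lines (reconstructed from memory as a minimal sketch; the tags, ranges and scoring are illustrative assumptions, not the actual prototype code):

```csharp
using UnityEngine;

// Gaze-based "X-ray": looking at a fish removes the bottle hidden
// inside it (a child object tagged "Bottle") and assigns a point.
public class XRayGaze : MonoBehaviour
{
    public int score;

    void Update()
    {
        // Cast a ray straight out of the (Cardboard) camera.
        var ray = new Ray(transform.position, transform.forward);
        if (Physics.Raycast(ray, out RaycastHit hit, 50f) && hit.collider.CompareTag("Fish"))
        {
            foreach (Transform child in hit.collider.transform)
            {
                if (child.CompareTag("Bottle") && child.gameObject.activeSelf)
                {
                    child.gameObject.SetActive(false); // the bottle is "gone"
                    score++;                           // and a score is assigned
                }
            }
        }
    }
}
```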

I should also thank Microsoft at this point. Despite us living in these crazy times of everyone being surveilled, many not being aware and some, such as myself, not caring all that much, and with my thinking I know where the world is headed with technologies, I like to take advantage of what is possible. Having a timeline in Microsoft Photos and quickly navigating back through the weeks, the photos and the screenshots I was taking? Extremely useful and surprisingly seamless. Looking forward to a world map with those attached and filtered!

In any case, this was the time in which we focused on using Madmapper and collaborating in Unity: sharing files, updating the skybox video, adjusting some parts, playing with Madmapper, calibrating the projector, calibrating Madmapper, testing it all, testing again, adjusting a bit more and testing again. Introducing some interesting effects and potentially artistically pleasing solutions, and getting ready for, well, what’s next.

360 Pre-Beta Dome Timelapse

Week 10: Playtech – Converging

This week was crazy.

From what I heard, Playtech turned out just right but there are still things to work on.

I should also mention at this point that I did try recreating the fish from drawings into 3D models in Rhino, in a quite rough way. They didn’t end up too bad, imho, but I still figured I’d mention it.

And since I had it a bit documented already, I’ll just paste in a few emails I sent preceding Playtech:

………………..


Date: 12th April

No, the skybox is not mine. Qiyao worked on it. I’m just putting together the Unity scene, testing it, experimenting and making sure it’s stable (and I kinda worked a bit with the projection mapping).

I had to set up a new collab environment and check ASAP if it works; I’m uploading stuff as I’m writing this email (sorry I could not get it set up earlier). Basically, there are a few more steps: enabling the Collaborate function in Unity and setting the seat options (owner).
So, as you join da team I’ve created for this project, you’ll be able to access it from Unity -> In the Cloud -> Ed Tech tesrt. At this point, you’ll probably have a number 1 or number 2 next to In The Cloud.
Once you access it, you’ll be able to download the package by going into Ed Tech tesrt (this will start Unity) and then -> Collab (dropdown menu) -> Hamburger icon (next to the three-people icon) called View History -> Restore/Install newest commit
In case all of this fails, I’ve uploaded a new .unitypackage to the shared drive folder.
Things to note:
– Qiyao was working on an updated skybox: reflected text, clear sky, perhaps also different colours. To update the skybox you need to:
– Import the video file (just drag and drop it into the Assets folder, wherever)
– Find test2_2 (Converted) on the left-hand side in the Hierarchy (by searching or scrolling through)
– Click on it
– In the Inspector on the right-hand side, the Video Player component has a Video Clip slot (currently holding Dome Animation v3)
– All you have to do is drag and drop the new video file into the box with Dome Animation v3 on it
– The skybox will update automatically; at first it will look black, but once you play it, the skybox should run updated
In case you want to mess around with the skybox options in detail, the skybox itself can be found in Materials -> Materials -> Sky; there you can browse through the shader options like Skybox Cubemap, 6-sided, Panoramic etc. You can also adjust the rotation of the skybox, which can prove useful during calibration. Once you click on Sky, you’ll have the options visible in the Inspector.
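(If the drag-and-drop route fails, the same swap can be scripted; a minimal sketch, assuming the Video Player sitting on test2_2 (Converted) and the newly imported clip get wired into the public fields:)

```csharp
using UnityEngine;
using UnityEngine.Video;

// Scripted equivalent of dropping a new video file onto the Video Clip slot.
public class SkyboxSwapper : MonoBehaviour
{
    public VideoPlayer domePlayer;  // the Video Player component on test2_2 (Converted)
    public VideoClip newSkyboxClip; // the freshly imported video file

    public void SwapClip()
    {
        domePlayer.Stop();
        domePlayer.clip = newSkyboxClip; // same as the drag-and-drop above
        domePlayer.Play();               // looks black until it plays, as noted
    }
}
```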
Other very important things:
I don’t know if this will happen once you download the project, but if you use the .unitypackage, it’s very likely your dome projection prefab will not work.
All you have to do is import the dome projection prefab component from your own hydrodrangeas project, delete the existing one and throw in the new one. You may try doing the same with the existing prefab you import with the Ed Tech tesrt.unitypackage, but from what I recall, that would just not work for some reason. If you import it again and add the component to Player in the Hierarchy, it will work.
I will try to update as much as I can before I go to sleep, and perhaps also before the hackathon for Tech Media Democracy starts. But let me know ASAP whether anything doesn’t work; at the latest, please text me or anything before 8:30, as afterwards I will have very limited availability throughout the entire day.
It’s all a bit wordy, but everything should be covered in case of what-ifs.
Post-scriptum:
– You can easily adjust last-minute colours of particular objects, like the bottles, by changing the material colours for Plastic and Wieko (you may have to adjust both the Albedo and Emission colors)
– If you see missing scripts or even prefabs, don’t panic; check if it runs. I had to carve out the GoogleCardboard stuff, so you may come across missing scripts, but it works without them
– If you or anyone else hates the raining bottles, just disable the ‘vending machine’ object in the Hierarchy. The bottles disappear after 60 seconds and the scene runs well on my computer even for longer periods of time, but if it doesn’t on yours, just disable this ‘vending machine’. If that doesn’t help, look up the terrain and disable it too. If that doesn’t fix it, idk, disable ‘xbot’ and ‘xbot (1)’ in the Hierarchy, as the mocap + particle-releasing shader combo may just be too heavy
– If you can’t get Madmapper configured on time, just use the Game window in the Unity project and adjust the scale slider
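(And a scripted version of the colour tweak from the first point, in case it has to happen at the very last minute; a sketch assuming the materials use the Standard shader, whose property names appear below:)

```csharp
using UnityEngine;

// Adjusts both Albedo and Emission on a material such as Plastic or Wieko.
public class BottleRecolor : MonoBehaviour
{
    public Material bottleMaterial; // e.g. the Plastic or Wieko material
    public Color newColor = Color.blue;

    public void Recolor()
    {
        bottleMaterial.color = newColor;                     // Albedo
        bottleMaterial.EnableKeyword("_EMISSION");           // make sure emission is on
        bottleMaterial.SetColor("_EmissionColor", newColor); // Emission
    }
}
```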
With all this, you should be set and able to madmap anything you need!
I’ll try to push through any updates as I receive them, whether the updated skybox, sound or other stuff. The base is there, even underwater sounds.
Hope all this helps if needed,
Cheers,
Fuad
……………..
Experimentation
……………..
Recreating drawings into 3D models
……………..

Date: Unsent

……………..

So, there are a few other things that I threw in:

– Microplastics: I introduced a particle system that is released based on motion capture data; the particles themselves are squarish but so tiny that they should give an impression of microplastics, if we want them there (a rough sketch follows at the end of these lists)
– Plastic bottles: there is a neverending pile of bottles coming down on the camera atm, which piles up around the camera (and kinda cleans itself up; I might reduce the timer on this)
Soon to be implemented:
– Updated skybox with narrative
Considered:
– I thought of having animated fish spawn continuously; after getting hit by a bottle, they would crash into the environment. Unrealistic, but it could support the narrative.
Will need to be set up on Saturday:
– Madmapper (this application is way too complex),
– Tripod with the mirror
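For the microplastics point above, the wiring is roughly this (a sketch from memory; the bone reference and rates are assumptions, not the scene’s actual values):

```csharp
using UnityEngine;

// Drives a particle system's emission from how fast a mocap-driven bone moves,
// so the dancing figure "sheds" microplastics as it moves.
public class MocapParticles : MonoBehaviour
{
    public Transform mocapBone;          // e.g. a hand bone on the xbot rig
    public ParticleSystem microplastics; // tiny square-ish particles
    public float ratePerSpeed = 50f;     // emission rate per unit of bone speed

    Vector3 lastPos;

    void Start() => lastPos = mocapBone.position;

    void LateUpdate()
    {
        float speed = (mocapBone.position - lastPos).magnitude / Time.deltaTime;
        lastPos = mocapBone.position;

        var emission = microplastics.emission;
        emission.rateOverTime = speed * ratePerSpeed; // faster motion, more particles
    }
}
```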
…………..
April 11

Hello all,

I just tested the new skybox and it works really well on its own and with 3D objects in the environment.

Funnily enough, there are a few interesting things happening with Unity when I extend the projection (with the skybox at least). I’ll let others take a look and assess for themselves, but it could serve as a good workaround if Madmapper did not pull through (I’ll be experimenting with it very soon; better to madmap this).
A few people came to visit, and so a few notes:
– I might have to change the water bottles, as they don’t look like water bottles but like germs or meteors. (I have those raining down, which I thought could be an interesting thing, especially since the bottles can react with the fish. I could program them so that, let’s say, once a bottle hits a fish, the fish changes colour to red and starts to drop; I know that’s contrary to how it works in real life, could do it either way. A quick sketch of that reaction follows this list.)
– I was asked if anyone will be giving out snacks outside
– I was asked if the Dome will stay around D12, cause apparently it would be a good meditation space (I second the opinion; it’s an amazing space, and I’d say even perfect for projecting 360 videos. I experimented a lot with my Gear 360, and while experiencing these videos on a phone, VR headset or computer is just inferior to regular videos, having them projected in a dome environment? I see potential, although naturally I didn’t test it due to time)
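(The bottle-hits-fish reaction from the first note could look something like this; a sketch only, with the tag and component setup being my assumptions:)

```csharp
using UnityEngine;

// On impact with a bottle, the fish turns red and starts to drop.
public class FishHitReaction : MonoBehaviour
{
    void OnCollisionEnter(Collision collision)
    {
        if (!collision.gameObject.CompareTag("Bottle")) return;

        // Turn the fish red (assumes a Standard-shader material).
        var rend = GetComponent<Renderer>();
        if (rend != null) rend.material.color = Color.red;

        // Stop the swim animation and let gravity take over.
        var anim = GetComponent<Animator>();
        if (anim != null) anim.enabled = false;

        var body = GetComponent<Rigidbody>();
        if (body == null) body = gameObject.AddComponent<Rigidbody>();
        body.useGravity = true;
    }
}
```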

Also, just an update note: we are in fact working with 3 colours here:
– Red
– Blue
– Purple
Each reacts with the blue/red glasses differently:
– Red Glass with Red keeps it Red
– Red Glass with Blue makes it Black
– Red Glass with Purple makes it White
– Blue Glass with Red makes it Black
– Blue Glass with Blue makes it Blue
– Blue Glass with Purple makes it Blue
– Without glasses, everything looks, well, as is.
Interestingly, darker shades of purple with the blue glass make a more distinct blue, differentiating from the lighter purple turning blue, which works well with the fish that have darker purple outlines. It may be hard to picture this from plain text, but my point is: there are many ways to experiment with the colours and the experience here.

The most important part is that it actually does work in a very interesting way.
For now I need the narrative and probably a second pair of eyes; as for the assets, I think we’re set. The skybox contrasts a bit too much with the terrain, though, but perhaps I can make the two work with one another.
Oh, also, I already mentioned to Anezka that my laptop works in a weird 3:2 aspect ratio, and I won’t be here for Playtech since I have to attend the Tech Media Democracy Hackathon on Saturday. So however the experience gets madmapped, it will have to be on a separate computer, very likely running in 16:9. Any other projector would run it as such, but since this one is 4K, it extends my 3K screen exactly as is; as a result, even without Madmapper, and due to a Unity video-skybox bug, the dome is almost entirely, fully projected on, well, right now actually 🙂
That’s all for now.
I’ll be at the Dome on D12 today and tomorrow if anyone would like to pop in.
……..
……..
~8 April
On this day I was observing the construction of the Dome. It took around 3 hours to get it raised (from what I heard, by BFA students from the Space/Materiality course), which I found incredibly impressive. Afterwards it was just setting up the environment and seeing how it looks from within the Dome, plus additional work and calibration.
Update 1:
Alpha 360 Timelapse from within the Dome

Update 2: Timelapse Alpha

Week 9 – Prototypin’

Looking back at what happened two weeks ago doesn’t seem like much of a stunt; however, once you try looking back through the haze of everything that’s been taking place, you cannot really settle on what happened on which day. It’s just this much work, squeezed throughout time of which you eventually start losing track. And as I try dealing with a variety of courses, Ed Tech being one of them, dedicating my unscathed concentration to one, a second, a fifth project, managing and converging ideas and dealing with multiple stakeholders, sometimes overlapping between projects, I feel this will all likely carry over to my eventual working environment, which, all in all, is good.

Oh, now I remember what happened. We went on a trip to LCCS, where we converged the groups Ocean Odyssey and Blussion, and we presented the low-fidelity prototypes. I personally decided to continue working on the Unity prototype, in which I tried converging a variety of the ideas that popped up. I had a prototype created in earlier weeks; however, it did not quite work as intended, and so I discouraged myself from presenting it. Due to my experience with projection mapping, I decided to refrain from providing user movement if the experience is going to be placed in a Dome environment, simply because it becomes blurry, likely due to the projector’s refresh rate. While light travels unbeatably fast, there are some other processes that just don’t make the shift of colour within the scene immediate, and all sorts of stuff can start happening: ghosting, blurring, artifacts etc.

On the other end of the spectrum, with Unity and the Google Cardboard prototype I created running on my Android phone, there were quite a few obstacles to deal with along the way. I had to install Android Studio to get the SDK package to get it running, but even then other issues remained: how to recreate the interaction methods in a new scene? What sort of interactions should be in the scene? I chucked in a counter system, along with basic colour shifting triggered by gazing at the objects, which in this particular case I kept as just simple cubes. I used the Gaia procedural environments system to create a new environment, but that was one of my first approaches to it, and with the documentation being quite unforthcoming, I had to experiment a bit and then refine the environment to feel more life-like. And then there were clashes between the plugins that I had to debug and attempt to resolve, and then the waiting time to get the prototype built, which can take ages; in this case it still needed a good hour or so to get it all up and running.
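
The gaze interaction itself boiled down to something like this (a minimal reconstruction, not the prototype’s exact code; the marker component is my own shorthand):

```csharp
using UnityEngine;

// Sits on the camera: looking at a cube shifts its colour and bumps a counter.
public class GazeColorShift : MonoBehaviour
{
    public int gazedCount; // the simple counter system

    void Update()
    {
        var ray = new Ray(transform.position, transform.forward); // camera gaze
        if (Physics.Raycast(ray, out RaycastHit hit, 100f))
        {
            var target = hit.collider.GetComponent<GazeTarget>();
            if (target != null && !target.triggered)
            {
                target.triggered = true;
                hit.collider.GetComponent<Renderer>().material.color = Color.green;
                gazedCount++;
            }
        }
    }
}

// Marker component on each cube so it only counts once.
public class GazeTarget : MonoBehaviour
{
    [HideInInspector] public bool triggered;
}
```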

The effort was relatively rewarding, though, as Gaia comes with some underwater post-processing effects that make it look as if you were actually underwater, with the addition of appropriate, matching sounds. The interactions worked, and I eventually presented the Cardboard prototype, which unfortunately didn’t look that good within the Google Cardboard. I personally thought the experience was rather average, but I actually received quite a bit of attention and feedback, with one LCCS student remarking that the experience was ‘cool’ and others pitching in ideas, asking whether this or that could be done.

The entire point of this prototype was to give an idea of what can be created, to show how the students’ ideas can be synthesised, to imagine how all of this could look in a Dome, where the entire environment would be rendered onto a 360 surface all around you, and to start thinking about how the experience should look at a more refined stage later on.

I managed to work on the prototype while others did sketches and drawings and continued to discuss what was needed to get the experience right. I managed to introduce an iteration in which transparent, red cylinders would drop from the sky and float down all around whoever was experiencing it, with the option for them to act and make the cylinders disappear. A direct reference to some of the core features and ideas voiced earlier, I believe in Blussion, where people could start cleaning the ocean or polluting it even more (the latter I did not introduce). I also started to tweak the colours so that they reacted appropriately with the 3D glasses, which was quite a pain; a bit of foreshadowing of what was to come next.

Here is a demonstration of one of the problems I had, as I would keep losing the interaction method once I descended below a specific level.

Afterwards, we scheduled what each of us needed to keep working on in the coming days: which assets, what narrative, recordings, all of that. And so we continued working.

PS: I also had a longer version of this post, but WordPress decided not to save the thing and all was lost (note to whoever decided to work in a WordPress environment)

Week 8 – Further On

So, a few related updates. First of all, it seems the LCCS kids were creating the assets and ideating for a VR environment. While a Dome could technically constitute a virtual environment, people would still be free to walk around and interact with it all, so in the end this will be a totally different, better, although more complex experience to address.

There seem to be more assets, such as creatures and the environment, that the kids also wanted to upload and design for. The script was also originally created as a very dynamic shift of scenes, which will totally not work for a projected environment. From my experience with large-scale projection mapping, the scene cannot move too fast, otherwise it will just be a blur, which will make the audience nauseous. It’s a similar issue with VR, hence the often limited movement in VR games, with things rather being thrown at the player to interact with. It might be different with projection mapping inside a dome environment; however, even in such a case I’d foresee the latency being simply too high. Again, objects moving too fast in the scene will likely just not work well (if they can’t design IPS monitors right, and even mine lags behind, I’d be surprised if they did a better job with projectors).

Interaction of some sort is required; the children want some to be embedded, although there is some ambiguity about what the interactions would actually be.

I see that the Blussion and Oceanic Odyssey projects are very alike, although a few features vary greatly. Based on the email follow-up, it seems these two projects may in the end turn out to be different. Hosting two projects within a dome shouldn’t be too much of an issue; for instance, it could be a scene changing over a fixed time, from Odyssey to Blussion. If the two were incorporated together, I’d say the effects showcased in Blussion could be achieved within Unity by messing with shaders. I have some basic knowledge of this (solid workshop led by Justin on the Friday before spring break!).

Another way these two projects could potentially be mixed is by incorporating some other technologies. For instance, I thought of perhaps including infrared/ultraviolet light-throwing devices, potentially costly, that could only be seen with the right, well, glasses. I think, however, that making the experience as seamless as possible would be preferable, therefore without the need for glasses. Also, synchronising Unity with external devices, although not impossible, could prove harder than expected.

For Unity, I pitched in a few other ideas:

  • For the skybox (the environment ‘sky’), it would be nice to have 360 footage/images from within the Hudson River (it works, I used some of my own 360 images)
  • Kinect could be used to introduce interaction between people within the Dome and Unity, although Kinect does get a bit wicked with many people present in the scene
  • Physical sensors could be an idea, although as the dome would likely be large, connecting all of them to, let’s say, an Arduino, plus the setup itself and probably the need to deal with physics messing with an appropriate and effective WYSIWYG concept, would likely just not work seamlessly in the long term (lots of people present around anything activates Murphy’s law in no time)
  • VR controllers are still an option, although letting many different people interact with the Dome would be cool and imho preferable
  • Paradoxically, one of the easiest ways would be to use a simple webcam from above that would register people via OpenCV, or even a simple Processing application, detecting people’s positions; depending on those, some stuff could happen, even the addition of plastic to the environment and getting it removed, which kinda blends in the polluters field-game concept (a crude sketch of this idea follows this list)
  • Although I didn’t voice this, I feel that involving any sort of UI in a 360 environment is a bad idea; I feel a Dome for projecting 360 environments exists primarily to create a compelling, immersive environment. UI just redirects the attention to itself and to the game’s rules, something that people will likely not learn in even a few minutes
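
To make the webcam idea concrete, here is a crude stand-in that skips OpenCV entirely and uses Unity’s own WebCamTexture instead: diff each frame against a captured background and take the centroid of the changed pixels as a rough position of whoever is underneath. A sketch under my own assumptions about thresholds, and it naively assumes a single blob:

```csharp
using UnityEngine;

// Crude overhead people-tracking with no external libraries.
public class OverheadTracker : MonoBehaviour
{
    WebCamTexture cam;
    Color32[] background;

    void Start()
    {
        cam = new WebCamTexture();
        cam.Play();
    }

    // Call once while the floor is empty to store the reference frame.
    public void CaptureBackground() => background = cam.GetPixels32();

    void Update()
    {
        if (background == null || !cam.didUpdateThisFrame) return;

        Color32[] frame = cam.GetPixels32();
        long sx = 0, sy = 0, n = 0;
        for (int i = 0; i < frame.Length; i++)
        {
            int diff = Mathf.Abs(frame[i].r - background[i].r)
                     + Mathf.Abs(frame[i].g - background[i].g)
                     + Mathf.Abs(frame[i].b - background[i].b);
            if (diff > 120) { sx += i % cam.width; sy += i / cam.width; n++; }
        }
        if (n > 200) // enough changed pixels to call it a person
        {
            // Centroid in 0..1 webcam space; map it into the scene to, say,
            // spawn or remove plastic where that person stands.
            Vector2 centroid = new Vector2(sx / (float)n / cam.width,
                                           sy / (float)n / cam.height);
            Debug.Log($"Person roughly at {centroid}");
        }
    }
}
```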

Other ideas were voiced by others:

  • Sketched animation of the environment, therefore an experience without much or any interaction (I just expressed that this would be tons of work)
  • Simplifying the experience to not contain interaction

Also, a few other things about other projects:

  • Family Feud (Trash Trivia) needs a screen
    • And a program to visualise something on it
      • There was stuff on GitHub recommended during the class
      • I’d say I’d get this up and running with a simple application in Processing as an option, but a Wizard of Oz needs to be present for revealing, well, the answers (a toy sketch of that reveal logic follows this list)
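
The Wizard-of-Oz reveal logic really is that small; here is a toy console sketch (with placeholder answers I made up, and the real thing would render full-screen, e.g. in Processing or via the GitHub projects from class), where the operator presses 1-5 to flip answers visible:

```csharp
using System;

// Toy Family Feud-style board: hidden answers until the "wizard" reveals them.
class TrashTriviaBoard
{
    static string[] answers = { "Plastic bottles", "Bags", "Straws", "Cans", "Wrappers" };
    static bool[] revealed = new bool[5];

    static void Main()
    {
        while (true)
        {
            Console.Clear();
            for (int i = 0; i < answers.Length; i++)
                Console.WriteLine($"{i + 1}. {(revealed[i] ? answers[i] : "_________")}");

            char key = Console.ReadKey(true).KeyChar;
            if (key == 'q') break;               // quit
            int slot = key - '1';
            if (slot >= 0 && slot < answers.Length)
                revealed[slot] = true;           // the operator reveals the answer
        }
    }
}
```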

But as also mentioned during the lecture, all of this is likely doable in many different ways. It’s kinda a cool age we’re living in: potentially anything can be created in any way, in so many elaborate ways, or simple ones too, which are cool or even cooler (if you think about it, evolution: it’s kinda simple how we arrived as a species where we are, despite the end product being, well, as elaborate as it gets; obviously this depends on the philosophical predispositions of each individual).

That’s all for now.

EDIT: I found the notes I made from one of the last LCCS trips, I hope my handwriting is legible to others 🙂

One of the Trash Trivia members filled this sheet:

 

Week 7 – Restructuring, Recharging, Analysis

So for this week I decided to, well, take a week off; it’s Spring Break. Discover a bit of what Manhattan has to offer, primarily in terms of museums, food, interesting attractions and Trader Joe’s. The beginning of the break was a notch hectic, though, as my mindset was still burning red after the mid-term streak and I had to reorganise in many ways. The break did me a lot of good, although I see what’s about to come: what actually seems like a step up from the mid-terms’ working rigour.

Before the Spring Break technically took off, I received a phone call, as apparently I had missed a conference call. I was quite behind with reading emails, so nothing really reached me on a normal basis throughout the preceding week. But I caught up to speed and managed to produce a few notes about this and that.

Also, I somehow magically cannot find the cluster of notes that I took at LCCS. Sadly, as there were plenty of great ideas that the students, myself included, came up with. In addition, I could not overcome some difficulties in getting the low-fi prototype to the extent I wanted; I had a few workarounds in mind, but those would have taken quite some time and effort. I’ll still see if I can push through a late version, but what it technically was, well, was a spin-off from the Hydrangeas. I really liked that concept and I just wanted to adapt it to Unity… until I found out it was actually kinda the same idea. I’ll push an update; perhaps I can get a conceptual build uploaded sometime before the next class takes place.

I had a few notes, though.

First, about Garbage Garry. I just managed to see the (in my eyes) absolutely brilliant animation series Love, Death + Robots, a Netflix anthology produced by David Fincher and Tim Miller along with some of the best animation studios around the world. Once I saw the 9th episode, ‘The Dump’, I was like: whoa, hold on, that looks like Garbage Garry to me! (For reference, for others interested to see how Garbage Garry’s world could look.)

Second: it seems that I will be in between teams, trying to stitch up the technological stuff, primarily for the Dome, but possibly also in between Unity and other technologies. Now, while my experience with Unity is ever-increasing, especially since I’m at Parsons, juggling a variety of technologies might prove a notch tricky. In any case, all three projects are connected in some way to underwater worlds, in which case the first reference link I established was, well, Subnautica: a rather interesting videogame where most of the time you get to explore rather sci-fi but still dynamic, diverse and extremely rich underwater biomes, where vibrant colours and spectacular fluorescent lighting come and go. It serves rather as inspiration, but a good source of it. Beyond that, Blue Planet I & II also look great as reference (those two I never managed to finish, though).

Instead of a field game, I have proposed gamifying the environment, such as the bins, connecting them to the Dome and visualising how not recycling pollutes the environment. This brings the ideas broached into rather shallower waters, but it’s very palpable: one action will generate a reaction. In any case, merging the ideas, or some of the features, seems (to me) like the right way to go. There is a reference for this, as at the University of Sydney a few people came up with the idea of a gamified trash bin.

…aaaand I guess that is all for now?

Week 6 – LCCS vol. 2

And once again we showed up at LCCS. 3 PM sharp, we made our way via the PATH train all the way to Jersey and then back. Between those two points in time, our time at LCCS was highly productive. The structure was relatively straightforward: we were grouped Parsons/LCCS students to create new/old storyboards on post-it notes to showcase our ideas for, well, stories. It went on a rotational basis, and our objective was to merge the stories. I tried my best to do so, writing down key points of the stories the students showcased (as at times I was grouped with a few), and ultimately it all came down to a story that one LCCS student came up with, which went standalone and viral, growing into Trash Trivia, in the lead-up to which a heated debate sprawled out with not much action taken on my part. This time I was being more proactive, but ultimately it came down to being a design facilitator and writing down those ideas. Despite my supporting role here, other students kind of drove their own ideas and logos, and new students joined up. And ultimately the idea, as sharp as it became with about 6 students involved in the project (rising from 2 at first, I believe), was quite captivating to observe, especially once it was driven to winning the contest by the end of our stay at LCCS.

From the details I have gathered, Trash Trivia is an idea which merges elements of Jeopardy and Family Feud, with the established centre-point being the theme of ecology, recycling and being conscious within such subjects, leading towards increased awareness. From what I could gather and figure out in my head, the main ingredients required here would be, well:

Which is kinda nice, as in a way this is an entirely doable concept, and from what I see also a highly engaging one, but it needs to keep a certain grit to it, perhaps host prizes, otherwise interest could be lost quite rapidly (surrounding marketing?).

I’ll upload the notes from the stories of other students that I transcribed. It’s a bit of a mess, as there is a lot, but I liked many of these stories; they were quite imaginative, although as they started mixing with other stories, it became more difficult to synthesise the whole variety of ideas into just one.

By the end of the week I received a phone call notifying me that there had been a meeting scheduled which I did not attend. I have kind of given up on reading emails, just glancing very briefly at each and every one incoming from the swarm hitting my Gmail shores, especially once I signed up to the MFADT newsletter. That, in addition to the rrrreeal sweet mid-term-a-day schedule, kinda did its thing, so that I was a bit unreachable for, well, almost an entire week I’d presume. I’m not entirely sure what my role is at this point; from what I’ve gathered, I’ll be supporting the technological part, with Unity in particular. I personally really liked the idea of the hydrodrangeas, and I sort of created a VR prototype for Monday’s student project proposals. It’s just that I battled for hours with Unity, switching between rendering pipelines, reimporting assets and, in fact, adapting a custom movement script I devised with a friend about 2 years back, designed primarily for Google Cardboard VR. Then, also, certain event systems decided not to work in VR for some reason, so I did come up with a prototype that I was ultimately unhappy with. I’ll revise it and perhaps attach it by the end of the day (or early tomorrow with additional supporting material).

The thing with working with Unity in particular, but pretty much with any more elaborate software, is that bouncing back and forth between its issues can become a nightmare. Whether it’s programming software, rendering lighting or working with Adobe software: one of the designerly pains that just come with the job etiquette.

Week 5 – Zoom

This class meeting was very difficult for me personally to attend. First of all, it seemed that some sort of apocalypse was coming down, sweeping through the city like a blizzard the size of a catastrophe showcased in some of the video games or movies I have experienced, where everything comes to a rapid halt and nobody moves from home anywhere near anywhere. In the end, it was supposed to be 8 inches of snow; from my mixed-background perspective, that ain’t much, and in the end there wasn’t much of anything really. That did not reopen the school, though, and it did not mean that the class meeting would not take place, along with another class I had scheduled for the evening at NYU that got pushed to Columbia.

That wasn’t the end of it, as I fell ill with, as I suspected, common symptoms of fever; some sort of viral infection raised mayhem in my immune system and the like, which grounded me for a bit less than a week, but that ain’t taking me down. And so, having treated myself to a large batch of Argentinian tea, I made my way eventually towards Columbia University, where I treated myself to a medium-sized cup of joe at Joe’s (coffee) and took a seat at Pulitzer Hall, where I put my laptop on my lap and proceeded to download whatever I needed to join the conference.

Now, first things first: I had never made use of the Zoom conference system. Back in the days, what I would usually use was Skype; I even remember, years ago, just discovering that I could do conferences with multiple people, even with cameras, and that was a breakthrough, introduced by the then-still-Estonian startup Skype (I think it was?). Now you’d do this with Facebook, you’d do this with every single application, but as it turns out, there are more of those out there, and that’s sort of cool, especially if it works well. And Zoom seemed to do just that, as people would connect even on their mobiles on the go. I guess the future is here, although I’d kinda argue it’s the technology of yesterday; it does the job just fine, though.

From what I recall (being quite tired, sneezing, coughing and all), I believe the meeting was a series of updates: us introducing our ideas, analyzing the ideas from LCCS and preparing for what’s to come next, such as the low-fi prototypes coming up soon. Here a quick note: from what I know and was taught, a low-fi prototype would be sketches, paper prototypes, some of the stuff we were doing weeks back. This time it seems I’m delivering a VR prototype; that’s hi-fi, although very likely low-res and, I’ll continue arguing, a horizontal prototype. The problem is, I haven’t really worked with VR on mobile for a while, especially for Android. There are two workarounds I could establish: one is to publish the prototype on a website, which I know runs at least on my mobile and could perhaps even run on iPhones in that case (not yet tested); the other is to just drive the development towards a mobile build, and perhaps PC if required.

The only question is, what exactly am I building? I really like the idea of the dome experience; it reminds me of those experiences created by TeamLab that I’ve seen, for instance, in Singapore. Except I found TeamLab’s experiences to be very low-fidelity, and I see a multitude of areas in which this can be improved. Especially within a dome. Especially with more responsive technology. Having some form of a clear narrative. Perhaps with computer vision, perhaps some other technologies. Some interesting ones were mentioned, like conductive ink and others. It feels like there are a lot of different materials, forms and techniques available, but the remaining question is, well, what’s the concept? Something dealing with utopias and dystopias? Sustainability? Where we want to be 50 years from now?

And so the work begins…

Week 4 – O’Really 360

So the class got really interesting, as we were connected over to LCCS via an Owl 360 broadcast system. This was a very interesting piece of technology that adjusted the scene to focus on whoever was currently speaking. It worked quite well most of the time. The children seemed a bit all over the place, but there were interesting stories to be heard, and I did like the novelty of how the classrooms were interconnected.

Simultaneously, we were introduced to the Rumii virtual reality classroom system. This piece of technology was quite intriguing on its own, as I see some potential in teaching conducted this way. Come to think of it, I never liked just watching video tutorials or learning in a non-engaging way, and here is a system which, in a relatively low-fidelity way, can interconnect people across the world, teachers and students; a technology I figured tech drivers at Microsoft would have introduced with their avatars on Xbox, or others from Sony with their own virtual worlds, or Second Life, as was mentioned during the class. Still, none of those went as far as to get it all rolling in a virtual reality world, it would seem (and as far as I know), which has plenty of potential. Especially if you can import your own 3D models and assets into Unity and throw them in. It’s a very interesting world we’re stepping into now.

For the third part of the class, we created (from what I recall) a sort of systems design in which we were to showcase the connections of a story in a cognitive-dissonance form, where I suspect part of the loop gets broken. For that, a story from Rick and Morty, a popular animated TV show featuring the two main characters, was involved: a spin-off of an episode in which Rick gets involved with the president of the United States. I decided to re-drive the storyline in the context of plastic and where it comes from, throwing in rickandmortish dialogue lines and contexts that I assumed would suit the show itself. I used screenshots of the show found on the web and traced them in Illustrator to achieve the storyboard’s particular cartoonish style.

Rick and Morty Storyboards: