Lesson 10 - Hybrid interactive scenarios
Transcript
00:08 - Lesson 10. Hybrid interactive scenarios. In this last tutorial of the series I’m going to show you how to integrate the data coming from the sensor we have used up to now, the accelerometer, with the features available inside TouchOSC. The goal is to control our instrument by mapping it to the accelerometer’s data and to a TouchOSC layout simultaneously. Unlike the other tutorials, I won’t guide you step by step through the creation of a patch; instead I will introduce you to a ready-to-use patch, the mapping logic behind it, and some new abstractions I created to handle control messages.

00:49 - When you open “lesson-10.pd”, at first glance this patch might look familiar. Do you recognize it? This is the patch we implemented for “amplitude modulation” synthesis in tutorial number 6, but with some edits. First, besides the “amplitude modulation” algorithm, I have also sketched “ring modulation”. Although the two implementations look quite similar, the results are fairly different. So you now have one patch that performs two different kinds of synthesis, and teachers and students are free to switch between “AM” and “RM”. I have also added a volume control.

01:32 - As you can see, the part of the patch in charge of receiving the packets of data sent by TouchOSC is clearly separated from the part that generates the sound, with the former on the right side. To do this I had to get rid of all the patch cords and replace them with a new pair of objects: “send” and “receive”.

01:54 - As the name suggests, “send” takes a message and sends it, without any patch cord, to one or more “receive” objects sharing the same name as the “send” object. Sharing the same name is crucial; otherwise no communication between the two can happen.

02:13 - Both objects can also be abbreviated: “s” for “send” and “r” for “receive”.

02:24 - Here these two objects let us create a cleaner, easier-to-understand environment. I suggest you check out their help files as well. (A small stand-alone example follows below.)

02:40 - Since the mapping of this patch is a little more sophisticated than the previous one, I also had to develop some new abstractions to properly route the different messages coming from the accelerometer and from the TouchOSC GUI, namely: “accGuiSwitch”, “switch2”, “spigot2” and “selector~”. The first one lets us switch between a synthesis controlled by the accelerometer’s data and one controlled by interaction with a TouchOSC layout. This will become clearer once I show you the mapping behind it.

03:26 - “switch2” lets us select which of two incoming data flows is sent out. “spigot2” is similar to the native “spigot” object, but this abstraction routes data to either of two outlets rather than only gating it.

03:44 - Please check the help file of “spigot” as well. “selector~” is similar to “switch2” but is meant to route audio signals instead of messages. I won’t go into detail about how these abstractions work internally, so I suggest you explore them if you want to grasp their logic; they’re pretty easy to understand. (A possible reconstruction of “spigot2” is sketched below.)

04:11 - Let’s talk about the mapping. Go to TouchOSC... you should now be looking at the “simple” layout, which looks like this. If you select the third rectangle at the top you should switch to this new window.

04:30 - In order to control our patch we are going to use this big yellow frame, named “xy”, and three of these four toggles.
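Before we get to the mapping, here are the two sketches promised above. First, a minimal stand-alone “send”/“receive” example; the name “volume” is an arbitrary placeholder, not a name taken from “lesson-10.pd”. Saved as a .pd file it reads:

#N canvas 100 100 420 240 12;
#X text 30 10 no cord joins the two sides: the value travels;
#X text 30 30 through the shared name volume (a placeholder);
#X floatatom 50 80 5 0 0 0 - - -;
#X obj 50 120 s volume;
#X obj 230 120 r volume;
#X floatatom 230 170 5 0 0 0 - - -;
#X connect 2 0 3 0;
#X connect 4 0 5 0;

Whatever number you type into the left number box reappears in the right one, with no patch cord between them.

Second, “spigot2” is a custom abstraction, so its exact contents depend on the files distributed with this lesson; the following is only one plausible reconstruction of the routing logic described above, built from two native “spigot” objects, saved as “spigot2.pd”:

#N canvas 100 100 460 320 12;
#X text 30 10 spigot2 sketch: left-inlet data goes to the left outlet;
#X text 30 30 when the control is 0 \, to the right outlet otherwise;
#X obj 50 80 inlet;
#X obj 260 80 inlet;
#X obj 260 120 == 0;
#X obj 50 180 spigot 1;
#X obj 160 180 spigot;
#X obj 50 250 outlet;
#X obj 160 250 outlet;
#X connect 2 0 5 0;
#X connect 2 0 6 0;
#X connect 3 0 4 0;
#X connect 3 0 6 1;
#X connect 4 0 5 1;
#X connect 5 0 7 0;
#X connect 6 0 8 0;

The “== 0” object opens the left-hand “spigot” exactly when the right-hand one is closed, so each value arriving at the left inlet leaves through exactly one of the two outlets, selected by the 0/1 control on the right inlet.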
So what is the mapping strategy behind them? The idea is that you can control the synthesis engine using accelerometer data and this GUI at once. But how? We used the accelerometer’s x and y values, discarding z (the depth), to control the “modulator” frequency and the global volume simultaneously.

05:06 - But we would also like to be able to choose between “amplitude” and “ring” modulation, and this is where the “simple” layout we are looking at right now steps in. The second toggle, this one, lets us select one of the two synthesis algorithms. When it is set to 0, like this, you are playing with “AM”; when it is engaged, like this, you are playing with “RM”.

05:38 - The first toggle lets us stop and play the soundfile we loaded inside the patch.

05:45 - This becomes evident if you look at the “select” object connected to this outlet.

05:52 - If “select” gets a 0, it triggers the 0 message sent to “readsf~”; if it gets a 1, the “open” message containing the path of the soundfile is triggered. If you don’t remember the “readsf~” syntax, please check its helpfile and tutorial number 6. (A hedged sketch of this play/stop logic appears after the transcript.)

06:14 - The last toggle is where the magic happens! If it is engaged, the accelerometer’s data are bypassed, so moving the smartphone no longer affects the sound. Instead, the sound is controlled by this two-dimensional slider.

06:35 - Let’s load a soundfile into our patch and edit the path here to create a loop.

07:03 - These will be the only operations you need to perform on the machine; all the others can be controlled through your smartphone. Let’s try it for a moment.

07:15 - As you can hear, the x axis of our slider, like the accelerometer’s data, is used to control the “modulator” frequency, and the y axis controls the volume.

07:42 - Let’s change the type of synthesis we want to control by selecting the second toggle. You can see the numbers changing in this number box here. As you can hear, the result changes considerably. Let’s now stop the sound with the first toggle, disengage the last toggle, and start the playback again in this way. Now we are once again controlling our patch using accelerometer data.

08:36 - As a very last thing, I would like to point out that even the layouts provided by TouchOSC are customizable, so you can easily create and load your own control interfaces. Although it’s quite straightforward, this is not a topic of this series, so I suggest you find one of the many tutorials available online. So we have come to the end of this series. With these tutorials I wanted to show you that everyone can learn the basics of a visual programming environment such as Pure Data, and to show its enormous educational potential. Starting from scratch, you are now able to take control of and understand patches of medium complexity like this one, and that is undoubtedly a big achievement.

09:17 - You are free to use the patches we developed together and the material I provided in different ways and contexts. If your students are advanced enough, you could teach them the basics of a programming language such as this and the logical thinking behind it, following these tutorials’ framework and setting specific learning goals. Otherwise, you could arrange new patches, or use the ready-made ones, to foster creativity and a desire for interaction in your younger students within different workshop scenarios. Keep exploring and having fun with Pure Data!
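Here is the play/stop-and-loop sketch referenced at 05:52 and 06:35. The receive name “play” and the path “./loop.wav” are placeholders, not the names used in the actual patch: a 0 arriving at “select 0 1” triggers the “0” message that stops “readsf~”, a 1 triggers the “open” message that reloads the soundfile and starts playback, and the bang from readsf~’s right outlet re-triggers the “open” message at the end of the file, producing the loop:

#N canvas 100 100 500 340 12;
#X text 30 10 play/stop + loop sketch for readsf~ (receive name;
#X text 30 30 and path are placeholders);
#X obj 60 70 r play;
#X obj 60 100 select 0 1;
#X msg 40 150 0;
#X msg 130 150 open ./loop.wav \, 1;
#X obj 60 220 readsf~;
#X obj 60 280 dac~;
#X connect 2 0 3 0;
#X connect 3 0 4 0;
#X connect 3 1 5 0;
#X connect 4 0 6 0;
#X connect 5 0 6 0;
#X connect 6 0 7 0;
#X connect 6 0 7 1;
#X connect 6 1 5 0;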
Example Patch:
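The original “lesson-10.pd” is distributed with the video series and is not reproduced here. Purely as an illustration of the AM/RM core described in the transcript, and not the actual patch, a minimal sketch might look as follows: the left channel carries amplitude modulation (the source multiplied by a sine rescaled to the unipolar 0..1 range) and the right channel ring modulation (the source multiplied by the bipolar sine directly). Both frequencies are arbitrary; in the real patch the 440 Hz oscillator’s place is taken by the soundfile played by “readsf~”, with the modulator frequency and volume driven by the accelerometer or the “xy” pad.

#N canvas 100 100 540 360 12;
#X text 30 10 AM vs RM sketch (not the lesson's patch): the left;
#X text 30 30 channel plays AM \, the right channel plays RM;
#X obj 60 80 osc~ 440;
#X obj 300 80 osc~ 110;
#X obj 300 120 *~ 0.5;
#X obj 300 160 +~ 0.5;
#X obj 60 230 *~;
#X obj 200 230 *~;
#X obj 130 290 dac~;
#X text 110 230 AM;
#X text 250 230 RM;
#X text 360 80 modulator;
#X connect 2 0 6 0;
#X connect 2 0 7 0;
#X connect 3 0 4 0;
#X connect 3 0 7 1;
#X connect 4 0 5 0;
#X connect 5 0 6 1;
#X connect 6 0 8 0;
#X connect 7 0 8 1;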