Before that, I would like to talk a little bit about our research project.
Our main interest in mobiles comes from this research project, which has been running since January.
It is built around the notion of participation, and that is what we try to bring into this project.
In particular, we would like to create more active engagement of the users.
In this context we specifically focus on mobile devices as tools to create new forms of music.
This also includes new forms of interaction.
We are a group of artists and researchers in art and design at the university,
and we would like to offer this idea of interacting in more aesthetically engaging forms.
The implementation of these forms of interaction in this research project is based on a modular structure.
Each part of the research will create and bring its own module, and each module will be a tool to assist the participants.
That gives us the possibility to achieve our goal: to create a better social experience of music.
This figure briefly explains what will happen in the end:
there will be participants, all novices, with these mobile devices,
and they will be able to make music together and, hopefully, enjoy it.
Currently we are exploring and focusing mostly on the interfaces and modules.
We focus on the mobile interfaces as tools that can act as musical instruments.
And in relation to that, we are also exploring alternative mapping strategies.
Before the initial design phase we had some ideas, but we still wanted to ask people
and learn their expectations about using mobile devices as musical instruments.
Our aim was to gather preliminary expectations about the sonic interaction
and the sonic characteristics of such devices.
Moreover, we wanted to know how the participants of these experiments would sketch
or define the sounds they would like these devices to make for certain tasks.
We conducted 14 user inquiry sessions with a very diverse set of participants.
They were all novices in terms of using mobile phones for making music.
And I would like to show a very small video clip.
[video clip plays]
It is nice for you to get a feeling of what these first sounds are like.
That was one of the common gestures that we observed.
In this experiment we had two different scenarios.
The second scenario included musical gestures: we asked participants to perform a free-form improvisation
using mobile phones.
They played for an audience and made their free improvisation.
We can state that the sound-action expectations with mobile devices in this sort of context
already begin to suggest certain gestures that can be linked with certain types of sounds.
The gestures we observed most in this experiment were shaking, squeezing, pointing in a direction,
circular movements, lifting up, rolling, and sometimes scratching.
The sounds that they described, sketched, or made were mostly percussive sounds.
Shaking immediately elicited percussive sounds,
lifting up produced continuous pitched sounds,
and discrete pitched sounds appeared when they pointed in a direction.
The nature of these mobile devices made a difference for forms of interaction
that employ dynamic hand or arm movements.
So this has been one of the important points in our design decisions.
Besides that, we also wanted to design interfaces that would enable participants to focus more on their interactions
with the other participants rather than keeping a continuous visual focus on the interface.
That brings another design direction:
we have been focusing on more eyes-free styles of interaction.
You keep your eyes away from the interface and you focus more on the interaction itself.
For this reason we decided that the main action to control the device will be tilt,
used to change the current state of any control parameter.
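The tilt-to-parameter idea can be sketched as plain code; the following is a minimal Python sketch under my own assumptions (the function name, the axis convention, and the milli-g units are not from the talk):

```python
import math

def tilt_to_param(ax, ay, az, lo=0.0, hi=1.0):
    """Map pitch tilt, derived from raw accelerometer axes, onto a
    control-parameter range [lo, hi]: flat maps to the middle of the
    range, fully tilted forward or backward maps to the endpoints."""
    pitch = math.atan2(ax, math.hypot(ay, az))   # -pi/2 .. +pi/2 rad
    norm = (pitch + math.pi / 2) / math.pi       # normalized to 0 .. 1
    return lo + norm * (hi - lo)
```

For example, with the device lying flat (gravity only on the z axis), `tilt_to_param(0, 0, 1000)` sits in the middle of the parameter range.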
However, this does not mean that we eliminate the touch gesture altogether.
At the end of the day, this is a hand-held, graspable device,
and the touch gesture has been there from the beginning:
ever since these phones came on the market, touch has been the first gesture
associated with this type of device.
So we also treat touch as one of the appropriate gestures on these devices.
Related to touch, we have been exploring alternative mapping strategies.
We have been using this two-dimensional touch surface,
and we wanted to experiment with what kinds of mapping strategies
we can get out of it, instead of the one-to-one method.
The structure of our four-point control layer
is based on positioning four points on the touch surface,
with the possibility to change their coordinate values based on certain events.
So what happens? This is the abstraction as a patch.
I will also show you that it works on the N900.
Once we set the positions of the four points
(they can be set in other ways, but we wanted to do it like this),
we can also store those positions.
This module continuously checks the vector position of the touch input,
compares it with the vector positions of the four points,
and outputs the ID of the closest point and its distance.
Based on that, we also have a weighted-average module.
Thanks to Tim, we have been developing this further
so that these four points can be mapped to multiple parameters.
And because we are using the weighted-average module,
the point closest to the touch input gets more weight than the other points.
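The closest-point and weighted-average behaviour just described can be sketched outside Pd as follows (a minimal Python sketch; the normalized coordinates and the corner positions are my assumptions, the real module is a Pd abstraction):

```python
import math

def closest_point(touch, points):
    """Return (id, distance) of the anchor point nearest to the touch input."""
    pid = min(points, key=lambda k: math.hypot(touch[0] - points[k][0],
                                               touch[1] - points[k][1]))
    px, py = points[pid]
    return pid, math.hypot(touch[0] - px, touch[1] - py)

def point_weights(touch, points):
    """Inverse-distance weights over all points, normalized to sum to 1,
    so the point closest to the touch gets the largest weight."""
    inv = {pid: 1.0 / (math.hypot(touch[0] - px, touch[1] - py) + 1e-9)
           for pid, (px, py) in points.items()}
    total = sum(inv.values())
    return {pid: w / total for pid, w in inv.items()}

# Four anchor points on a normalized surface; their coordinates can be
# moved at run time in response to events, as described above.
corners = {0: (0.0, 0.0), 1: (1.0, 0.0), 2: (0.0, 1.0), 3: (1.0, 1.0)}
```

A touch near one corner then reports that corner's ID and pulls most of the weight toward it, which is what makes the mapping richer than a one-to-one assignment.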
So, the demo.
Before the demo, I would like to talk a little bit about how we compiled Pure Data
on the N900. The N900 comes with an operating system called Maemo,
which is a Linux distribution.
That is also why we have another project with the Nokia Research Center.
We wanted to try whether we could really easily compile Pure Data.
We just took the source of Pd vanilla, and the build was very straightforward.
After installing the necessary libraries, we just compiled it,
and it is like the normal Pure Data that you interact with on your desktop or laptop.
The audio architecture of the N900 is different,
so we had to bypass some layers and communicate with the device directly.
[inaudible exchange while setting up the camera for the demo]
I am running a terminal as root, and then Pd starts.
You can see the main Pd window,
and you can see that the basic Pd structure is there.
You can also see that we could compile all the necessary objects that we need.
In this case, this is a grid object,
and also expr and all the calculation objects that we found necessary.
And the platform is quite open about giving access to this sensor data.
So what happens?
It is like a direct connection: you need to read a system file
that the device continuously updates.
For example, I am polling it here.
It makes things a little slower,
but that is not much of a problem.
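The polling described here reduces to re-reading a sysfs file. Here is a minimal Python sketch; the file path and the three-integer "x y z" format are assumptions about the Maemo accelerometer interface, not something verified in the talk:

```python
# Assumed sysfs path for the N900 accelerometer under Maemo 5 (hypothetical).
ACCEL_PATH = "/sys/class/i2c-adapter/i2c-3/3-001d/coord"

def parse_coord(text):
    """Parse one 'x y z' reading (whitespace-separated integers) into a tuple."""
    x, y, z = (int(v) for v in text.split())
    return x, y, z

def read_accelerometer(path=ACCEL_PATH):
    """Poll the sensor by re-reading the file the kernel keeps updated."""
    with open(path) as f:
        return parse_coord(f.read())
```

Calling `read_accelerometer()` in a loop gives the "direct connection" behaviour: each read returns whatever the kernel last wrote to the file.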
So this is like a giant grid object.
Now I am changing the parameters:
these four points are actually connected to the magnitude of the vibration of the device.
I can show you that.
This can be used as a rough way of giving haptic feedback.
I am not sure yet what to do with this number,
but you can change, basically, how much it is going to vibrate.
We cannot change the duration at the moment,
so it has to vibrate a certain amount,
and giving the value zero
cuts the vibration.
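The vibration behaviour just described (adjustable magnitude, fixed duration, zero to stop) can be sketched like this; the device path and the 0-255 range are hypothetical, only the clamp-and-write shape is meant literally:

```python
# Hypothetical writable device file for the vibration motor.
VIBRA_PATH = "/sys/devices/platform/vibra/level"

def clamp_level(level, lo=0, hi=255):
    """Clamp a requested vibration magnitude into the writable range;
    writing 0 cuts the vibration, as described in the talk."""
    return max(lo, min(hi, int(level)))

def set_vibration(level, path=VIBRA_PATH):
    """Write the clamped magnitude; the duration is fixed by the driver
    and cannot be changed from here."""
    with open(path, "w") as f:
        f.write(str(clamp_level(level)))
```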
Another thing: even though we are focusing on eyes-free style interaction,
there is a small LED display here
which gives you some visual feedback.
In normal use of the mobile phone,
if you just missed a call or something,
it gives you a kind of continuous visual feedback.
We can also access that one.
I am not sure how much visual feedback it really provides,
but it can be used as an alternative way to give visual feedback to participants.
They can still keep their eyes free,
because, remember, we need them to focus continuously on each other,
and this small LED display can be used for that.
And quickly to the conclusion.
There is a variety of possibilities to investigate sonic interaction
with these mobile devices.
There is already a community with experience in this area,
and we will also observe how they are actually using it.
There is good potential to develop this four-point control idea,
especially with these two-dimensional surfaces,
since it gives you the possibility of various mapping strategies.
We found the user inquiry sessions very helpful,
especially for understanding these expressive gestures:
what do people really mean by lifting up,
and how are sound and action coupled?
On the other hand, the recent changes at Nokia did not really give us very good news.
The operating system, Maemo, is not going to be developed further.
They are going to release one more phone, the N9,
which is going to run MeeGo, which, as I was told, is a better version of Maemo.
But after that, because they have already made an agreement with Microsoft,
they are going to have Windows as the operating system.
From then on, I think after the new year,
they are going to have a different mobile series.
Therefore it is our intention to investigate
porting to other possible mobile operating systems;
we are looking at which one will most likely be the next candidate.
Thank you.
Just one comment: I think it is important to shift away from that system,
because you want many people to be able to use it.
Yes, exactly.
And then the question: you have four points; could you also use three, five, or six?
Yes, definitely, it can be any number.
The idea was that the touch position is actually calculated with respect to the four points,
and the four points define the frame of the surface.
That is fixed: if you do not change these points,
we always get the same coordinates, as a kind of background.
The idea is to actually change the frame,
so that we create another border inside the surface.
Yes, kind of, like a border within the two-dimensional surface.
But obviously it can be changed at any point.
[remainder of the exchange inaudible]
Any other questions? Okay. Well, thank you.
We have the coffee break now, and we shall start again on schedule, I would say.