2018-June-20

Rodolphe Bourotte has shared four short UPISketch tutorial videos demonstrating the basic functionality of the application.

Tutorial Video 1 – UPISketch Hands on

Tutorial Video 2 – Moving objects

Tutorial Video 3 – Navigation

Tutorial Video 4 – Save, Export and Load


2017-October-24

Rodolphe Bourotte has shared a UPISketch development report.

This is our first experience of the direct relation between sound and its visual representation.
This form of interaction gives a great feeling of freedom, leaving behind the traditional track paradigm of classical Digital Audio Workstations.
The next step will be to allow free drawing of the pitch on the basis of a selected sound.


2017-September-20

Rodolphe Bourotte has shared a UPISketch development report.

Since the first target is a touchscreen device, navigation is simplified by using two-finger touches for zooming. We can draw details at any scale and have the playhead update the page at the current zoom level. Finger touches are monitored for debugging purposes.

This is running on an iPhone 5s with screen sharing activated, so it is not very responsive.

Sorry, no sound output yet!


2017-July-24

Rodolphe Bourotte has shared the UPISketch Development Report.


The GUI (Graphical User Interface) is in progress: the vectorization algorithm is applied to drawing gestures.
(This still has to be improved, so that curves can be modified more easily with fewer control points.)
As of now, some buttons (which will look nicer in the future) do what we intend them to:
we can pan, zoom, draw, and clear.
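For illustration, this kind of gesture vectorization is often done with the classic Ramer–Douglas–Peucker algorithm, which keeps only the points that deviate noticeably from a straight line. A minimal sketch under that assumption (the actual UPISketch algorithm may differ):

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

struct Point { double x, y; };

// Perpendicular distance from p to the line through a and b.
static double perpDistance(const Point& p, const Point& a, const Point& b) {
    double dx = b.x - a.x, dy = b.y - a.y;
    double len = std::sqrt(dx * dx + dy * dy);
    if (len == 0.0)
        return std::sqrt((p.x - a.x) * (p.x - a.x) + (p.y - a.y) * (p.y - a.y));
    return std::fabs(dy * p.x - dx * p.y + b.x * a.y - b.y * a.x) / len;
}

// Ramer–Douglas–Peucker: keep only points deviating more than epsilon
// from the straight line between the endpoints, recursively.
std::vector<Point> simplify(const std::vector<Point>& pts, double epsilon) {
    if (pts.size() < 3)
        return pts;
    double maxDist = 0.0;
    std::size_t index = 0;
    for (std::size_t i = 1; i + 1 < pts.size(); ++i) {
        double d = perpDistance(pts[i], pts.front(), pts.back());
        if (d > maxDist) { maxDist = d; index = i; }
    }
    if (maxDist <= epsilon)                         // flat enough: endpoints suffice
        return { pts.front(), pts.back() };
    std::vector<Point> left  = simplify({ pts.begin(), pts.begin() + index + 1 }, epsilon);
    std::vector<Point> right = simplify({ pts.begin() + index, pts.end() }, epsilon);
    left.insert(left.end(), right.begin() + 1, right.end()); // drop duplicated split point
    return left;
}
```

A near-straight wobbly stroke collapses to its two endpoints, while a genuine corner survives as a control point.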

The enclosing rectangles will later be shown only for the selected curves.

Each curve is given boundaries in time and pitch by the coordinates of its enclosing rectangle relative to the page parameters.
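As a hypothetical illustration of that mapping (the names PageParams and boundsFromRect, and the choice of seconds and MIDI note numbers as units, are assumptions for the sketch, not the actual UPISketch code):

```cpp
// Hypothetical page parameters: the page spans a time range in seconds and a
// pitch range in MIDI note numbers; a curve's enclosing rectangle is given in
// normalised page coordinates (0..1, origin at the top-left).
struct PageParams { double pageDuration; double pitchLow, pitchHigh; };

struct CurveBounds { double startTime, endTime; double pitchLow, pitchHigh; };

CurveBounds boundsFromRect(const PageParams& page,
                           double rectX, double rectY,
                           double rectW, double rectH) {
    CurveBounds b;
    b.startTime = rectX * page.pageDuration;
    b.endTime   = (rectX + rectW) * page.pageDuration;
    // y grows downward on screen, so the rectangle's top is the highest pitch.
    double span = page.pitchHigh - page.pitchLow;
    b.pitchHigh = page.pitchHigh - rectY * span;
    b.pitchLow  = page.pitchHigh - (rectY + rectH) * span;
    return b;
}
```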

Each curve possesses three different internal representations: the digitized gesture, the processed control points, and the recalculated curve from the set of control points.
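A minimal sketch of such a three-representation curve, with simple piecewise-linear interpolation standing in for whatever recalculation the app actually uses (names are hypothetical):

```cpp
#include <cstddef>
#include <vector>

struct Point { double x, y; };

// One curve keeps all three stages, so the raw gesture is never lost and the
// control points can be re-edited at any time. (Sketch only; the real
// UPISketch class layout may differ.)
struct Curve {
    std::vector<Point> rawGesture;     // 1. digitised finger/pen samples
    std::vector<Point> controlPoints;  // 2. simplified, editable handles
    std::vector<Point> rendered;       // 3. curve recalculated from the handles
};

// Recompute the rendered curve from the control points by piecewise-linear
// interpolation (a spline would be a natural upgrade).
void rebuild(Curve& c, std::size_t samplesPerSegment) {
    c.rendered.clear();
    if (c.controlPoints.empty())
        return;
    for (std::size_t i = 0; i + 1 < c.controlPoints.size(); ++i)
        for (std::size_t s = 0; s < samplesPerSegment; ++s) {
            double t = double(s) / samplesPerSegment;
            const Point& a = c.controlPoints[i];
            const Point& b = c.controlPoints[i + 1];
            c.rendered.push_back({ a.x + t * (b.x - a.x), a.y + t * (b.y - a.y) });
        }
    c.rendered.push_back(c.controlPoints.back());
}
```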

The next step will be to treat these first primary curves as polycurves: the primary curve gives the pitch, and at least one secondary curve, attached to the same polycurve, gives the amplitude.

We will also need to provide a user interface solution for this.
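The planned polycurve could be sketched like this (names and layout are hypothetical, since the feature is not yet implemented):

```cpp
#include <vector>

struct Point { double x, y; };
using Curve = std::vector<Point>;

// A polycurve bundles one primary curve (pitch over time) with any number of
// secondary curves sharing the same time axis (amplitude first, then others).
struct PolyCurve {
    Curve pitch;                   // primary: the pitch contour
    std::vector<Curve> secondary;  // e.g. secondary[0] = amplitude
};
```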


2017-June-30

Rodolphe Bourotte has shared the UPISketch Development Report.


As you may know from a preceding post, a first step of development was achieved in April/May.
We hired a developer, Sean Soraghan, who previously worked at ROLI, to build this first draft.
It was a great opportunity to get something well crafted that respects the JUCE coding standards.
(JUCE is the C++ framework we are using for UPISketch.)
For those who may want to know: the utility of a framework is that the programmer doesn't have to reinvent the wheel for each and every subfunction the application needs (and it needs a whole lot of them!).

So far, so good. Next, I had to run several tests with the delivered app and dive into its numerous pages of code to get a better understanding of how it works (after half a day of in-person debriefing with Sean in London).
To move forward, we needed a proof of concept for the analysis/synthesis part.
We still have to tune the algorithms so that they do a better job, but we hope to keep going with this idea.

Below is a screenshot/video of milestone 1: we load a sound, analyze its pitch, and change its pitch contour:
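The core idea of changing a pitch contour can be illustrated with a toy function: given the analysed per-frame pitch and a drawn target contour, the resynthesis stage needs a per-frame shift ratio of target/original. (This is a deliberately simplified sketch; the real analysis/synthesis pipeline is much more involved.)

```cpp
#include <cstddef>
#include <vector>

// Per-frame shift ratios needed to turn the analysed pitch track into the
// drawn target contour; frames where no pitch was detected (0 Hz) pass
// through unshifted.
std::vector<double> shiftRatios(const std::vector<double>& analysedHz,
                                const std::vector<double>& targetHz) {
    std::vector<double> ratios(analysedHz.size(), 1.0);
    for (std::size_t i = 0; i < analysedHz.size() && i < targetHz.size(); ++i)
        if (analysedHz[i] > 0.0)
            ratios[i] = targetHz[i] / analysedHz[i];
    return ratios;
}
```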

Concurrently with Sean, I have investigated several objects / functions of the JUCE framework, essential to our application:

Among other things, a test application has been made which combines XmlElement, ValueTree, FileTreeComponent, and TextEditor.
All these elements are essential for managing file writing as XML and for allowing undo on a set of operations.
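In JUCE, the undo part is handled for us by ValueTree together with UndoManager; the underlying idea can be sketched framework-free as a stack of reversible actions:

```cpp
#include <functional>
#include <utility>
#include <vector>

// Each action stores both how to perform itself and how to revert itself.
struct Action {
    std::function<void()> perform;
    std::function<void()> undo;
};

class UndoStack {
public:
    // Execute the action and remember it for later undo.
    void apply(Action a) {
        a.perform();
        history.push_back(std::move(a));
    }
    // Revert the most recent action; returns false if there is nothing to undo.
    bool undoLast() {
        if (history.empty())
            return false;
        history.back().undo();
        history.pop_back();
        return true;
    }
private:
    std::vector<Action> history;
};
```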

Below is a video taken from an iPhone 5:

I am now working on the definition of a second milestone, which will consist of a basic GUI for drawing sounds on a page (first the pitch, then other parameters!).
This definition is a catalogue of all the objects we need to create (menu items, drawing areas, tools…) and a description of how they interact. To this end, I explored the world of lambda functions, used extensively in milestone 1. I hope I'll be able to follow this nice way of implementing callbacks between objects!

Speaking of interaction: in some of the next posts we'll probably discuss the zooming feature which, while broadly used in numerous apps, is far from trivial!
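One reason it is non-trivial: when the user zooms, the point under their fingers (say, the midpoint of a two-finger pinch) must stay exactly where it is on screen, which forces the view offset to be recomputed at the same time as the scale. A one-dimensional sketch of the bookkeeping (names hypothetical):

```cpp
// View transform: screen = (world - offset) * zoom.
struct View { double offset; double zoom; };

// Zoom about a screen anchor so that the world position under the anchor
// stays under it: solve screen = (world - offset) * zoom for the new offset.
void zoomAbout(View& v, double screenAnchor, double zoomFactor) {
    double worldAnchor = screenAnchor / v.zoom + v.offset; // world point to pin
    v.zoom *= zoomFactor;
    v.offset = worldAnchor - screenAnchor / v.zoom;        // keep it pinned
}
```

The same derivation applies per axis in 2-D, with the pinch midpoint as the anchor.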