From the early stages of human history, it has been of paramount importance for every human being to understand the emotional state of their contemporaries, and to communicate their own feelings to others, in order to achieve acceptance and prosperity within the "pack": skills to which we refer today as "social" or "emotional" intelligence. The concept of correlating human expressions, movements and emotional states, and of comparing them to those found in the animal world, was first introduced into modern science by Charles Darwin in 1872 in his book "The Expression of the Emotions in Man and Animals". From that point on, research in this field has become more and more popular, especially with the development of computers and digitization technologies, which allow the gathering and processing of data to be automated.

Digitization of artistic paintings:

An interesting approach to the study of human facial expressions, presented in the abstract "Not Exactly Prima Facie: Understanding the Representation of the Human Through the Analysis of Faces in World Painting" [1], is to closely examine large quantities of artistic paintings from all periods of history using the face-recognition algorithm behind Facebook's photo-tagging system. The analysis of each face consists of creating a graph of basic features and clustering extended features, which enables classification in a historical, cultural and geographical sense. This data collection opens the possibility of analysing the result of hundreds of years of artistic fascination with the human face, giving us a valuable new perspective on historical evolution as well as on cultural and geographical differences in expressing the same emotions.

Example of a face graph for two randomly selected paintings. The red crosses indicate what the authors call basic face features; the blue lines show the distances between the features.

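The abstract does not publish its code, but the basic-features graph can be sketched in a few lines: landmark points become graph nodes, and the pairwise distances between them become the edges. The landmark names and coordinates below are invented for illustration; distances are normalised by the inter-ocular distance so that faces of different sizes remain comparable.

```python
# Minimal sketch of the face-graph idea, with hypothetical landmarks.
from itertools import combinations
from math import dist

landmarks = {                    # hypothetical pixel coordinates
    "left_eye":  (120, 140),
    "right_eye": (180, 142),
    "nose_tip":  (150, 180),
    "mouth":     (150, 215),
}

# Normalisation factor: distance between the eyes.
scale = dist(landmarks["left_eye"], landmarks["right_eye"])

# Edge set: every pair of basic features with its normalised distance.
face_graph = {
    (a, b): dist(landmarks[a], landmarks[b]) / scale
    for a, b in combinations(landmarks, 2)
}

for edge, d in sorted(face_graph.items()):
    print(edge, round(d, 3))
```

Comparing these edge vectors across thousands of digitized paintings is what makes the historical and geographical clustering in the abstract possible.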

Digitization of physical movement:

Besides facial expressions, another aspect of physical human expression is movement: the way we walk, do sports, dance, and so on. Until now, however, there was no available technology that could support digital analysis of human motion. As described in the abstract "Computer Identification of Movement in 2D and 3D Data" [2], the ARTeFACT project, funded by an NEH Digital Start-Up Grant and an ACLS Digital Innovation Fellowship, tackles this old problem with new strategies. Moves are reduced to a set of small standard patterns, "steps", which form the "grammar" for motion detection. This keeps the database of steps much smaller, but, as experiments show, accurately detecting steps and reconstructing the move from them has unfortunately proved less successful in practice than detecting a whole move by simple correlation. Since an infinite database of every possible move is not feasible, another strategy was introduced, based on "conceptual metaphor", which takes into account the fact that humans perceive movement not as a machine does but as a subjective interpretation. With this model we no longer deal with simple detection of movement but with "semantic descriptions of human motion" [2]. Again, we converge on truly understanding the nature of human expression through motion in order to define the sets of rules and concepts which drive this model.
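The "steps as grammar" idea can be illustrated with a toy parser: instead of matching whole moves against an enormous database, a pose sequence is segmented greedily into a small vocabulary of short standard patterns. The pose labels and step names below are invented for illustration; ARTeFACT's actual motion features are far richer than symbolic labels.

```python
# Hedged sketch: reconstructing a move from a small "grammar" of steps.
STEPS = {                         # hypothetical step vocabulary
    "plie":   ("bend", "hold"),
    "releve": ("rise", "hold"),
    "turn":   ("pivot",),
}

def parse_move(poses):
    """Greedily segment a pose sequence into named steps."""
    result, i = [], 0
    while i < len(poses):
        for name, pattern in STEPS.items():
            if tuple(poses[i:i + len(pattern)]) == pattern:
                result.append(name)
                i += len(pattern)
                break
        else:                     # no step matched: flag the unknown pose
            result.append("?")
            i += 1
    return result

print(parse_move(["bend", "hold", "rise", "hold", "pivot"]))
```

A real system would need to rank overlapping candidate steps and tolerate noisy sensor data, but the payoff is the same: a handful of step patterns can describe a combinatorially large space of moves.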

Figure 2

A sustained note at 440 Hz in three environments used for live coding audio: (a) Impromptu, (b) ixi lang, and (c) Pure Data (see Sorensen, 2013; Bausola et al., 2013; and Puckette, 2013, respectively).

Digital sound production:

After facial and motion expression, audio expression remains to close the loop. Recording voice and sound is old news; however, there is a new and very different method of creative expression through digital sound production, using special, often user-made programs: musical live coding, described in the abstract "Live Coding Music: Self-Expression through Innovation" [3]. During a performance, the musician produces sound by coding in real time. In addition, these programs are often written or heavily modified by the musicians themselves, because the basic concept of live coding is not only playing music but creating it. Consequently, the process of building a personalized sound-production environment is a form of creative expression in itself.
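The sustained 440 Hz note from Figure 2 can be synthesised the long way round with nothing but a standard library; live-coding environments such as Pure Data, Impromptu and ixi lang wrap exactly this kind of oscillator loop behind a terser, performance-friendly notation. The filename below is arbitrary.

```python
# A sustained A440 tone, written out as a 16-bit mono WAV file.
import math
import struct
import wave

RATE, FREQ, SECONDS = 44100, 440.0, 1.0

# One second of a pure sine wave at 440 Hz.
samples = [
    math.sin(2 * math.pi * FREQ * n / RATE)
    for n in range(int(RATE * SECONDS))
]

with wave.open("a440.wav", "wb") as wav:
    wav.setnchannels(1)           # mono
    wav.setsampwidth(2)           # 16-bit samples
    wav.setframerate(RATE)
    wav.writeframes(b"".join(
        struct.pack("<h", int(s * 32767)) for s in samples
    ))
```

In a live-coding session the performer would change the frequency, envelope or waveform expression while the loop is running, which is precisely the immediacy the abstract describes.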

Conclusion:

One main idea binds the modern approach to all of these traditional concepts: digitization! In a world where digitization technologies of all kinds are widely available, we gain a new perspective on every aspect of our lives. Consequently, the use of digitization for human expression and its analysis has become a worldwide trend.

References:

  1. Not Exactly Prima Facie: Understanding the Representation of the Human Through the Analysis of Faces in World Painting (http://dh2013.unl.edu/abstracts/ab-206.html)
  2. Computer Identification of Movement in 2D and 3D Data (http://dh2013.unl.edu/abstracts/ab-239.html)
  3. Live Coding Music: Self-Expression through Innovation (http://dh2013.unl.edu/abstracts/ab-315.html)