Publications

2017
Morgan, Evan; Webb, Lynda; Goddard, Nigel; Carter, Kate; Webb, Janette: "Co-Designing Innovations for Energy Saving in Large Organisations". Inproceedings. In: Adjunct Proceedings of the International Conference on Designing Interactive Systems (DIS2017), Edinburgh, 2017. ISBN: 9781450349918. Tags: behaviour change, co-design, living lab, sustainability.
2016
Webb, Lynda; Morgan, Evan; Carter, Kate; Webb, Janette; Goddard, Nigel: "A Living Lab Co-Creational Approach to Energy Demand Reduction in Non-Domestic Buildings: Understanding the Organisation". Inproceedings. In: BEHAVE 2016 – 4th European Conference on Behaviour and Energy Efficiency, Coimbra, 2016. URL: http://www.research.ed.ac.uk/portal/files/29525980/webb2016.pdf. Tags: behaviour, co-creation, energy efficiency, living labs, organisations.
2015
Morgan, Evan; Gunes, Hatice; Bryan-Kinns, Nick: "Using affective and behavioural sensors to explore aspects of collaborative music making". Journal Article. International Journal of Human-Computer Studies, vol. 82, pp. 31–47, Elsevier, 2015. ISSN: 1071-5819. DOI: 10.1016/j.ijhcs.2015.05.002. URL: http://linkinghub.elsevier.com/retrieve/pii/S1071581915000853. Tags: affect, collaboration, creativity, improvisation, music, psychophysiology.
Abstract: © 2015 Elsevier Ltd. All rights reserved. Our research considers the role that new technologies could play in supporting emotional and non-verbal interactions between musicians during co-present music making. To gain a better understanding of the underlying affective and communicative processes that occur during such interactions, we carried out an exploratory study in which we collected self-report measures and continuous behavioural and physiological measures from pairs of improvising drummers. Our analyses revealed interesting relationships between creative decisions and changes in heart rate. Self-reported measures of creativity, engagement, and energy were correlated with body motion, whilst EEG beta-band activity was correlated with self-reported positivity and leadership. Regarding co-visibility, lack of visual contact between musicians had a negative influence on self-reported creativity. The number of glances between musicians was positively correlated with rhythmic synchrony, and the average length of glances was correlated with self-reported boredom. Our results indicate that ECG, motion, and glance measurements could be particularly suitable for the investigation of collaborative music making.
Morgan, Evan; Gunes, Hatice; Bryan-Kinns, Nick: "The LuminUs: Providing Musicians with Visual Feedback on the Gaze and Body Motion of Their Co-performers". Incollection. In: Human-Computer Interaction – INTERACT 2015, vol. 9297, pp. 47–54, Springer International Publishing, 2015. ISSN: 0302-9743, 1611-3349. DOI: 10.1007/978-3-319-22668-2_4. URL: http://link.springer.com/10.1007/978-3-319-22668-2_4. Tags: computer-supported cooperative work, eye-tracking, groupware, musical interaction, non-verbal communication, social signals.
Abstract: © IFIP International Federation for Information Processing 2015. This paper describes the LuminUs, a device that we designed in order to explore how new technologies could influence the inter-personal aspects of co-present musical collaborations. The LuminUs uses eye-tracking headsets and small wireless accelerometers to measure the gaze and body motion of each musician. A small light display then provides visual feedback to each musician, based either on the gaze or the body motion of their co-performer. We carried out an experiment with 15 pairs of music students in order to investigate how the LuminUs would influence their musical interactions. Preliminary results suggest that visual feedback provided by the LuminUs led to significantly increased glancing between the two musicians, whilst motion-based feedback appeared to lead to a decrease in body motion for both participants.
2014
Morgan, Evan; Gunes, Hatice; Bryan-Kinns, Nick: "Instrumenting the Interaction: Affective and Psychophysiological Features of Live Collaborative Musical Improvisation". Inproceedings. In: 14th International Conference on New Interfaces for Musical Expression (NIME14), London, 2014. Tags: affect, creativity, improvisation, music, psychophysiology.
2013
Morgan, Evan; Gunes, Hatice; Harris, Dominic: "Gesturing at Architecture: Experiences and Issues with New Forms of Interaction". Inproceedings. In: Proc. of CHI'13 Workshop on Experiencing Interactivity in Public Spaces (EIPS), pp. 117–121, Paris, France, 2013. URL: http://www.cs.tut.fi/ihte/EIPS_workshop_CHI13/papers.shtml.
Morgan, Evan; Gunes, Hatice; Bryan-Kinns, Nick: "Measuring affect for the study and enhancement of co-present creative collaboration". Inproceedings. In: Proceedings of the 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction (ACII 2013), pp. 659–664, 2013. ISBN: 9780769550480. DOI: 10.1109/ACII.2013.115. Tags: affect, collaboration, creativity, emotion, music, physiology, social signals.
Abstract: Affective computing research has tended to focus on the recognition of emotional states in individuals, with the intention of enhancing human-computer interaction. In this paper we advocate the need for a shift of attention towards emotional communication between people. To contextualise our views we discuss the ways in which rapid technological advances have impacted society and human psychology over the last decade. By outlining our doctoral research topic, we then highlight how affective computing based research could help us understand and enhance co-present human-human interactions. We are especially interested in studying situations where the interaction is directed towards collaborative creativity, as there is little existing work in this area and we see great potential for real-world applications to stem from our research. © 2013 IEEE.
Morgan, Evan; Gunes, Hatice: "Human nonverbal behaviour understanding in the wild for new media art". Journal Article. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 8212 LNCS, pp. 27–39, 2013. ISSN: 0302-9743. DOI: 10.1007/978-3-319-02714-2_3. Tags: affective behaviour understanding in the wild, gestural interaction, mood, new media art, nonverbal behaviour.
Abstract: Over the course of the London 2012 Olympics a large public installation took place in Central London. Its premise was to enable members of the public to express themselves by controlling the lights around the rim of the London Eye. The installation's design and development was undertaken as a collaborative project between an interactive arts studio and researchers in the field of affective and behavioural computing. Over 800 people participated, taking control of the lights using their heart rates and hand gestures. This paper approaches nonverbal and affective behaviour understanding for new media art as a case study, and reports the design of this installation and the subsequent analysis of over one million frames of physiological and motion capture data. In doing so it sheds light on how the intersection of affective and behavioural computing and new media art could be beneficial to both researchers and artists. © 2013 Springer International Publishing.