This team is looking for Python expertise.
Eye movements are known indicators in non-verbal communication and have long been analysed to assess the truthfulness of a subject's verbal utterances. Using an Eye Tribe camera, eye movements can be tracked to generate data which will be parsed and sonified. The project is at the idea stage, and it would be great to get some Python expertise on board: the goal is to take the streamed data in real time, pass it to Max MSP (which I'm comfortable with), and sonify the user's gaze immediately. Or bring your own ideas for how to use it... what can eye movement-generated data do???
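As a rough sketch of what the Python side of the pipeline could look like: the Eye Tribe tracker exposes its data as JSON frames over a local TCP connection, and the gaze coordinates in each frame could be normalised and then forwarded to Max MSP (for example as OSC messages into a [udpreceive] object). The exact frame schema and screen resolution below are assumptions for illustration, not confirmed details of this project; the parsing step needs no hardware to try out:

```python
import json

# Assumed screen resolution; the tracker reports gaze in screen pixels.
SCREEN_W, SCREEN_H = 1920, 1080

def parse_gaze(frame_json):
    """Extract the smoothed gaze point from one JSON frame and
    normalise it to the 0.0-1.0 range, which maps conveniently onto
    synth parameters in Max MSP. The frame layout here (values ->
    frame -> avg) is an assumption modelled on the Eye Tribe's
    JSON-over-TCP API, so adjust the keys to match real output."""
    msg = json.loads(frame_json)
    avg = msg["values"]["frame"]["avg"]   # smoothed gaze point, in pixels
    x = max(0.0, min(1.0, avg["x"] / SCREEN_W))
    y = max(0.0, min(1.0, avg["y"] / SCREEN_H))
    return x, y

# Example with a hand-made frame (not real tracker output):
sample = json.dumps(
    {"category": "tracker", "statuscode": 200,
     "values": {"frame": {"avg": {"x": 960.0, "y": 540.0}}}})
print(parse_gaze(sample))  # → (0.5, 0.5)
```

From there, one option would be to send each `(x, y)` pair to Max MSP as an OSC message (e.g. with a small OSC library such as python-osc) and do the actual sonification in a Max patch, keeping Python responsible only for reading and cleaning the stream.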