Replication, confirmation and falsification are important
concepts in science, yet it is well known that publishing mere
replications - or even just studies with no "novelty value" - is
surprisingly difficult. Many scientists would admit this is a
problem, but precious few actually publish their experimental code
and software. Maybe some people feel anyone can replicate an
experiment given a proper "Methods" section in a journal - but as a
reviewer, I often find these lacking important details. Be that as
it may, I have, through the years, tried to disseminate code as
well: to entertain (some people actually pay to do "braintraining"),
to replicate (or expand) or to inform (or learn, say, E-Prime). A number of experiments are provided for the E-Prime platform by Psychology Software Tools, which I co-authored a book
about (The E-Primer), and some people have found a use for taking my
experiments apart and seeing how they could do something similar. Do
let me know if you find a similar good use! However, I will not take
responsibility should your computer (or participant) explode.
More likely, the experiments should work fine with a bit of
tweaking, since they tend to be early versions that aren't
release-worthy and haven't been tested on different platforms. So,
whether you run into a problem, or don't and just happen to be in an
appreciative mood, please email me.
Virtual Mirror. A tracking experiment designed for a digital tablet and pen. It should also work with the mouse! See Spapé & Serrien (2010) in the publications section. Update: some people seem to like the experiment as it shows what can be done with E-Prime and a tablet. I have now updated it to E-Prime 2 and added a little mirroring feature (like Serrien & Spapé, 2009, Neuroscience Letters). Let me know if you find it useful.
Updating Conflict. This is one in a series of experiments designed to test whether conflict (a Simon effect in this case) is affected by moving the box in which it appears. The short answer is: yes (paper submitted).
Tracking an Object Across Perception and Action. This is what you get when you try to combine the original Hommel (1998) Event Coding paradigm with Kahneman et al.'s (1994) Object Reviewing paradigm. It provides evidence for independent feature conjunctions: rotating the boxes affects only the location-shape (as in the reviewing study) and location-response (new) bindings, whilst keeping the shape-response association intact (see Spapé & Hommel, in press).
Attentional Blink. Based on Elkan Akyürek's dissertation. Typically produces strong lag-1 sparing and attentional blink, but low recovery. See: Colzato, Spapé, Pannebakker & Hommel (2007) and Colzato, Slagter, Spapé & Hommel (2008).
Facilitative vs negative priming experiment. Based on Huber et al. (2002). Repeating a target and/or foil in a two-alternative forced-choice procedure produces strong facilitative priming effects at short prime durations, but a 'preference' against the target at longer prime durations.
Vibro-tactile feature integration. Like the object-tracking experiment above, but concerning intermodal feature integration (see Zmigrod, Spapé & Hommel, in press). Also shows how you can use a standard Xbox 360 controller as a cheap vibro-tactile display. After installing the setup.exe program on computer 1 (with the .NET 2 framework and the Xbox 360 drivers installed - see Microsoft downloads), connect it via a serial cable to computer 2 (with the E-Prime experiment installed) and voilà!
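The serial protocol between the two machines isn't documented here, but to illustrate the idea: the bridge program could accept a single byte per trigger, encoding which rumble motor to drive and at what strength. A hypothetical Python sketch of such an encoding (the byte layout, function name, and values are my own invention, not the actual setup.exe format):

```python
def vibration_command(motor, strength):
    """Pack a vibro-tactile trigger into one byte for the serial link.

    Hypothetical protocol: the high bit selects the motor
    (0 = left/low-frequency, 1 = right/high-frequency) and the low
    7 bits give the intensity (0-127).
    """
    if motor not in (0, 1):
        raise ValueError("motor must be 0 (left) or 1 (right)")
    if not 0 <= strength <= 127:
        raise ValueError("strength must fit in 7 bits")
    return bytes([(motor << 7) | strength])
```

On the E-Prime side, writing such a byte to the serial port would then trigger the corresponding vibration on the controller attached to computer 1.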
Cheesepsy is my latest experiment, designed to test motor control and handedness in children from 4 years of age (though 5- and 6-year-olds find it easier). It runs on any computer with .NET 4 installed and is compatible with various MIDI devices. Here, we used a DJ controller: rotating the left wheel moves the left mouse, and the right wheel the right mouse. The subjects are asked to keep following the cheese with the mice (and the mouse shows happy animations and makes squeaky sounds if the participant succeeds).
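The wheel-to-cursor mapping can be sketched as follows. This is a hypothetical Python reconstruction (the actual program is .NET-based); it assumes the jog wheels send relative MIDI CC values in "two's complement" style, where 1-63 means clockwise steps and 65-127 counter-clockwise, and the CC numbers are made up:

```python
# Hypothetical CC numbers for the two jog wheels; real DJ
# controllers differ, so these would need remapping per device.
LEFT_WHEEL_CC, RIGHT_WHEEL_CC = 16, 17

def wheel_step(value):
    """Decode a relative 7-bit CC value into a signed step count."""
    return value if value < 64 else value - 128

def apply_cc(positions, cc, value, gain=3):
    """Move the left or right on-screen mouse horizontally in
    response to one CC message. positions is [left_x, right_x]."""
    if cc == LEFT_WHEEL_CC:
        positions[0] += wheel_step(value) * gain
    elif cc == RIGHT_WHEEL_CC:
        positions[1] += wheel_step(value) * gain
    return positions
```

The gain factor simply trades off precision against how far a child has to spin the wheel to keep up with the cheese.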
The Self in Conflict was designed as a virtual, mediated version of one of my previous studies (Spapé & Hommel, 2008), but with the Simon task and avatars. A Kinect was used to capture the motions of participants, who then looked at a virtual version of themselves imitating their movements. It wasn't actually coded by me (but by Imtiaj Ahmed); in the spirit of open access, however, we published the experimental platform along with the new publication.
These experiments were mainly made to gain insight into the interplay between (multiple) object tracking on the one hand, and perception-action feature integration on the other. They are meant as pilots of strange new paradigms involving motion tracking, or as demonstrations to give people some idea of "what I do". If you (e.g. a Master's student) are interested in collaborating on one of these projects, or in having the source code, please drop me an e-mail (in the upper-right corner).
Continuous Objects in Motion and Action (English). This continuous tracking task requires participants to press various buttons when a letter flashes in an object. Mostly, this lets me study object-based attention in a more fluid way than the usual trial-by-trial paradigm allows. It may also allow answering questions such as "how many objects can be tracked both in action and perception" and "what affects the aging of feature bindings". Be warned: at this moment, the COMA task goes on forever unless you rudely wake the participant. So please save your data at any time, or whenever you have had enough of it.
The Binding Arkanoid presents a kind of "binding game". The task requires participants both to track a target as it moves across a playing board and to cope with its various abrupt changes. A theory based on integration and retrieval predicts that some types of errors are much more likely than others, an effect that is itself modulated by task relevance. Italian version thanks to Cristiano Cellini.
Morphing Memory is a demonstration of the sequential modulation of the Simon effect, which is itself modulated by feature integration. Please see the E-Prime scripts or my poster, which explains this better. I also made a Dutch version of this experiment.
Insight and Matchsticks was based on the work of Knoblich et al. (1999), who were interested in the processes involved in insight problem solving. This Flash version was somewhat based on Michael Öllinger's Java program. Subjects were asked to correct the equation by (mentally) moving one of the matchsticks. They could then enter the solution using the point-and-click interface. I believe it should be fairly easy to edit the code of this experiment, so you're welcome to download it here. (Update, 2014: I corrected the link to point to the actual SWF (Flash) file, since the present server does not support PHP. This works, but no data will be saved in this demo - use the PHP in the zip file instead.)
A few macros I wrote for Vision Analyzer can be found here.
A MATLAB function to convert continuous ECG data to IBI here. This sounded ridiculously easy when I started coding, but because of recording problems (not-quite-sticky electrodes, reversed polarity), it turns out to be more difficult than it looks. The code takes an EEGLAB struct, tries to differentiate between "real" heartbeats and secondary effects, interpolates missing beats, and returns a useful EEGLAB struct with a continuous IBI (or BPM) variable. The image to the right shows it in action on 20 s of data: the upper graph shows the raw ECG with vertical bars marking detected beats, and the lower graph shows the output in IBI (ms).
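The MATLAB code itself is in the download; as a rough illustration of the same pipeline, here is a simplified Python/NumPy sketch under my own assumptions (threshold-based R-peak detection, rejection of secondary deflections, and splitting of over-long intervals to recover missed beats). The real function works on EEGLAB structs and is considerably more robust:

```python
import numpy as np

def ecg_to_ibi(ecg, fs, min_ibi=0.3):
    """Convert a raw ECG trace (1-D array, sampling rate fs in Hz)
    into beat times (s) and an inter-beat-interval series (ms)."""
    # 1. Candidate R-peaks: local maxima above an adaptive threshold.
    thresh = np.mean(ecg) + 2 * np.std(ecg)
    peaks = [i for i in range(1, len(ecg) - 1)
             if ecg[i] > thresh
             and ecg[i] >= ecg[i - 1] and ecg[i] > ecg[i + 1]]
    # 2. Reject "beats" that follow the previous one implausibly fast
    #    (T-waves, artefacts): keep the larger of the offending pair.
    clean = []
    for p in peaks:
        if clean and (p - clean[-1]) / fs < min_ibi:
            if ecg[p] > ecg[clean[-1]]:
                clean[-1] = p
        else:
            clean.append(p)
    beats = np.array(clean) / fs
    if len(beats) < 2:
        return beats, np.array([])
    # 3. Recover missed beats by splitting over-long intervals into
    #    the number of median-sized intervals they most likely hide.
    ibi = np.diff(beats)
    med = np.median(ibi)
    fixed = []
    for iv in ibi:
        n = max(1, int(round(iv / med)))
        fixed.extend([iv / n] * n)
    return beats, np.array(fixed) * 1000.0
```

Splitting by the median interval is the crude part: it assumes the heart rate is roughly stationary, which is exactly what breaks down with badly attached electrodes, hence the extra effort in the real code.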