Teaching

CINACS Study Program


Winter Semester 2010 - 2011


Name: CINACS-Ringvorlesung
Type: lecture
Amount of Time: 2 hours
Period of Time: Winter term 2010; starting Oct. 18th, 2010: Mondays, 14:15-15:45
Location: D-125
Frequency: weekly
Content: Natural cognitive systems, such as humans, profit from combining the input of the different sensory systems, not only because each modality provides information about different aspects of the world but also because the different senses can jointly encode particular aspects of events, e.g. the location or meaning of an event. However, the gains of cross-modal integration come at a cost: since each modality uses very specific representations, information needs to be transferred into a code that allows the different senses to interact. Corresponding problems arise in human communication when information about one topic is expressed using combinations of different formats such as written or spoken language and graphics.

In this lecture, we will focus on models and methods suitable to realize processes and representations for cross-modal interactions in artificial cognitive systems, i.e. computational systems. After introducing the core phenomena of cross-modal interaction, we exemplify the mono-modal basis of cross-modal interaction and the current development of informatics-oriented research in this field with four topics:

  • Cross-modal information fusion for a range of non-sensory, i.e. categorial, data in the area of speech and language processing, where visual stimuli have to be merged with the available acoustic evidence. Among the language-related information sources, lip reading certainly provides one of the major contributions of additional evidence, but more recently eyebrow movement and its relationship to suprasegmental features of human speech has attracted considerable attention as well. (A minimal fusion sketch follows this course entry.)
  • The interaction of representational modalities, such as language and maps, in their interdependence with sensory modalities, in particular vision, auditory perception and haptics. The computational analysis of multi-modal documents or dialogues is a prerequisite for advanced intelligent information systems as well as for human-computer interaction, in particular human-robot interaction. Furthermore, such computational devices can be used in systems giving assistance to impaired people, e.g. blind or visually impaired or deaf people.

  • Multimodal memory plays an important role for the next generation of mobile robots and service robots. Using grounded memories of robot actions, based on real-world visual, audio and tactile data collected by the robot, instead of solely a sensorimotor controller, the robot's memory can be enriched, and thus the robustness of both the representations and the retrieval process of autonomous agents will increase.
  • Neural architectures for multiple modalities. The brain plays the central role in all animal or human behaviour. The integration of various kinds of sense information with cognitive processing in neural architectures is therefore particularly relevant. Examples of computational neural architectures are described, from spiking neural networks to supervised and self-organizing artificial neural networks based on midbrain and cortical brain areas. The focus will be on auditory and visual modalities, illustrated by some examples of robotic behaviour.
Leadership: Prof. Dr. Christopher Habel, Prof. Dr. Wolfgang Menzel, Prof. Dr. Stefan Wermter, Prof. Dr. Jianwei Zhang
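The following is a rough, minimal sketch (in Python, not course material) of the decision-level audio-visual fusion named in the first topic above; the syllable candidates, scores and weighting are hypothetical and only illustrate the idea of combining acoustic and lip-reading evidence.

    # Minimal sketch of decision-level (late) audio-visual fusion.
    # The distributions below are hypothetical outputs of an acoustic model
    # and a lip-reading (visual) model for the same utterance.
    import math

    def fuse(acoustic, visual, alpha=0.7):
        """Log-linear fusion: score(c) = alpha*log p_a(c) + (1-alpha)*log p_v(c)."""
        scores = {c: alpha * math.log(acoustic[c]) + (1 - alpha) * math.log(visual[c])
                  for c in acoustic}
        z = sum(math.exp(s) for s in scores.values())   # renormalize
        return {c: math.exp(s) / z for c, s in scores.items()}

    # Hypothetical syllable candidates: the acoustic evidence is ambiguous
    # between /ba/ and /da/, while the visible lip closure favours /ba/.
    acoustic = {"ba": 0.45, "da": 0.45, "ga": 0.10}
    visual   = {"ba": 0.70, "da": 0.20, "ga": 0.10}

    print(fuse(acoustic, visual))   # fused distribution; "ba" is now preferred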


Name: Psychophysical and experimental methods for cognitive neurosciences
Type: Seminar
Amount of Time: 2 hours
Period of Time: Winter term 2010; starting Oct. 21st, 2010: Thursdays, 14:15-15:45
Location: Von Melle Park 11, Room 102
Frequency: weekly
Content: The focus will be on the typical methods employed in studies on visual information processing. After successfully completing this course, you should be well prepared for your first steps in this field. Topics include: psychophysical methods (signal detection theory, psychometric methods, Fourier analysis), specific statistical methods (e.g. standard errors in repeated-measures designs), and also more practical issues like stimulus generation and exact timing of stimulus presentation (a small worked example of signal detection analysis follows this entry). The course will also include practical training in the programming language Matlab, which is used widely in the cognitive neurosciences for stimulus presentation (PsychophysicsToolbox) and data analysis (SPM for fMRI / EEGLab for EEG).
Target Audience: CINACS and Neurodapt PhD students, master and diploma students of the Psychology department (with focus "cognitive neuroscience")
Leadership: V. Franz
Attendance: Recommended for PhD students who want to start with their own experimental projects.
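As a rough illustration of the signal detection theory mentioned in the course content above, here is a minimal sketch of computing the sensitivity index d' and the response criterion c from hit and false-alarm counts; the counts are invented, and Python/SciPy is used here for brevity although the course itself works with Matlab.

    # Minimal signal detection theory sketch: d' and criterion for a yes/no task.
    # The trial counts are hypothetical.
    from scipy.stats import norm

    hits, misses = 42, 8                 # signal-present trials
    false_alarms, correct_rej = 12, 38   # signal-absent trials

    hit_rate = hits / (hits + misses)
    fa_rate = false_alarms / (false_alarms + correct_rej)

    # d' = z(hit rate) - z(false-alarm rate); c = -(z(H) + z(FA)) / 2
    d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)
    criterion = -(norm.ppf(hit_rate) + norm.ppf(fa_rate)) / 2

    print(f"d' = {d_prime:.2f}, criterion c = {criterion:.2f}")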

Summer Semester 2010


Name: Neurobiology
Type: lecture
Amount of Time: 2 hours
Period of Time: each summer term since 2008
Frequency: weekly
Content: an introduction to the basics of neurobiology
Target Audience: CINACS PhD students, also open to master students
Leadership: G. Liu
Attendance: recommended


Winter Semester 2009 - 2010


Summer Semester 2009


Name: Chinese Language Course
Type: language course
Amount of Time: 2 hours
Period of Time: summer terms 2007-2009
Frequency: weekly
Content: basic Chinese language skills and customs
Target Audience: CINACS PhD students
Leadership: Chinese teacher
Attendance: mandatory



Name: Multi-Modality: Interfaces and Documents
Type: lecture
Amount of Time: 2 hours
Period of Time: summer term 2009
Frequency: weekly
Content: multi-modal documents, multimodal interaction, sensory modalities, computer processing of multimodal documents in the framework of information retrieval
Target Audience: CINACS PhD students, also open to diploma and master students
Leadership: C. Habel
Attendance: recommended



Name: Neurobiology
Type: lecture
Amount of Time: 2 hours
Period of Time: each summer term since 2008
Frequency: weekly
Content: an introduction to the basics of neurobiology
Target Audience: CINACS PhD students, also open to master students
Leadership: G. Liu
Attendance: recommended


Winter Semester 2008 - 2009


Summer Semester 2008


Name: Neurobiology
Type: lecture
Amount of Time: 2 hours
Period of Time: each summer term since 2008
Frequency: weekly
Content: an introduction to the basics of neurobiology
Target Audience: CINACS PhD students, also open to master students
Leadership: G. Liu
Attendance: recommended


Name: Chinese Language Course
Type: language course
Amount of Time: 2 hours
Period of Time: summer terms 2007-2009
Frequency: weekly
Content: basic Chinese language skills and customs
Target Audience: CINACS PhD students
Leadership: Chinese teacher
Attendance: mandatory


Winter Semester 2007 - 2008


Summer Semester 2007 



Name: Cross-modal Interaction in Natural and Artificial Cognitive Systems
Type: CINACS General Lecture (Ringvorlesung)
Amount of Time: 2 hours
Period of Time: summer term 2007
Frequency: weekly
Content: cross-modal information fusion, interaction of representational modalities, multi-modal robot memory
Target Audience: CINACS PhD students, also open to diploma and master students
Leadership: C. Habel, W. Menzel, J. Zhang
Attendance: recommended



Name: Chinese Language Course
Type: language course
Amount of Time: 2 hours
Period of Time: summer terms 2007-2009
Frequency: weekly
Content: basic Chinese language skills and customs
Target Audience: CINACS PhD students
Leadership: Chinese teacher
Attendance: mandatory


CINACS-Ringvorlesung: Cross-modal Interaction in Natural and Artificial Cognitive Systems

Lecturer: Christopher Habel, Wolfgang Menzel, Jianwei Zhang.

Date: Mondays, 12:15-13:45, weekly

Start: April 2nd, 2007

Location: Informatikum F-534

Natural cognitive systems, such as humans, profit from combining the input of the different sensory systems, not only because each modality provides information about different aspects of the world but also because the different senses can jointly encode particular aspects of events, e.g. the location or meaning of an event. However, the gains of cross-modal integration come at a cost: since each modality uses very specific representations, information needs to be transferred into a code that allows the different senses to interact. Corresponding problems arise in human communication when information about one topic is expressed using combinations of different formats such as written or spoken language and graphics.
In this lecture, we will focus on models and methods suitable to realize processes and representations for cross-modal interactions in artificial cognitive systems, i.e. computational systems. After introducing the core phenomena of cross-modal interaction, we exemplify the mono-modal basis of cross-modal interaction and the current development of informatics-oriented research in this field with three topics:

  • Cross-modal information fusion for a range of non-sensory, i.e. categorial, data in the area of speech and language processing, where visual stimuli have to be merged with the available acoustic evidence. Among the language-related information sources, lip reading certainly provides one of the major contributions of additional evidence, but more recently eyebrow movement and its relationship to suprasegmental features of human speech has attracted considerable attention as well.

  • The interaction of representational modalities, such as language and maps, in their interdependence with sensory modalities, in particular vision, auditory perception and haptics. The computational analysis of multi-modal documents or dialogues is a prerequisite for advanced intelligent information systems as well as for human-computer interaction, in particular human-robot interaction. Furthermore, such computational devices can be used in systems giving assistance to impaired people, e.g. blind or visually impaired or deaf people.

  • Multimodal memory plays an important role for the next generation of mobile robots and service robots. Using grounded memories of robot actions, based on real-world visual, audio and tactile data collected by the robot, instead of solely a sensorimotor controller, the robot's memory can be enriched, and thus the robustness of both the representations and the retrieval process of autonomous agents will increase.


Winter Semester 2006 - 2007

Domain: Cognitive Neuroscience (Focus)

Date: WS 06/07, Mondays 12:15-13:45 (2 SPW, 6 LP), weekly
Start: Mo, October 23rd, 2006
Location: ESA H

Target Group: MSc., Dipl.-Psych. and doctoral students in Psychology, Medicine, Informatics. Basic knowledge in Experimental Psychology and Biological Psychology is needed.

Lecturers: Brigitte Röder, Andreas Engel, Christian Büchel

Topics
The lecture gives an overview of the most important areas of neuroscience, including visual, tactile and multisensory perception, attention, language, executive functions, neuroplasticity, learning and memory, lateralization, emotion and motivation, and consciousness.

Summer Semester 2006

  • Lecture: Knowledge Representation

Date: During the CINACS Summer School 2006
Location: F-334/5, MIN (Informatikum), University of Hamburg

Lecturer: Prof. Dr. Christopher Habel

  • Lecture: 3D-Vision

Date: During the CINACS Summer School 2006
Location: F-334/5, MIN (Informatikum), University of Hamburg

Lecturer: Dr. Chen Shengyong

Last Updated: Monday, 18 October 2010