Text and writing have always been multimodal. Reading requires both sight and hearing – seeing the marks (letters) arranged on a page and hearing, at least internally, the sounds each represents, and so interpreting what the text is saying. Naturally, one mode tends to dominate – in this case sight over hearing (when listening to a speech, hearing would dominate instead). But the text is still multimodal.
In the same way, everything around us can be interpreted as multimodal. Even purchasing something as simple as a bottle of mineral water involves sight and hearing (as shown above) along with touch and taste, in combination with the ideologies we bring to our water-buying. Even an object that contains no language can communicate by its presence – through sight, hearing, touch and the assumption that it must have been created for some purpose, we can understand what it is or was used for.
We need to understand that multimodal “texts” are not just to be read – they are to be used, and so students need to be taught how to use them. It is remarkable how far we have come, with pictures appearing in books and textbooks not as a mere complement to the words but as descriptions in their own right. There is so much to be gotten out of a picture that would never need to be explained in words. And pictures (along with other multimodal options) communicate – bringing knowledge and understanding that at times cannot be gained from reading words on a page.