Bilinguals, people who know and use two or more languages, are an interesting source of clues about the internal make-up of our language system. In particular, it is remarkable that bilinguals reliably access the right words in the right language without making mistakes, even though their languages overlap considerably in semantics, orthography, and phonology. In computational psycholinguistics, we study phenomena such as word retrieval by means of computer models. Although we have no direct access to the word store embedded in our mind, modeling can provide clues about how it is organized: by constructing models that simulate key findings from psycholinguistic experiments, we can test hypotheses about its structure. However, while current models of bilingual word reading can account for most of these findings, they largely underspecify a crucial component of everyday word retrieval: meaning. Moreover, and related to this shortcoming, most models of word access treat words in isolation. In reality, words are always embedded in sentences and in larger linguistic and non-linguistic contexts, which also influence how we access them. By building models of sentence processing, we can give meaning a more central role in our models, and thereby offer new explanations for several phenomena in bilingual word processing.