VS298: Additional resources

Here is a collection of additional material that you may find useful for this course.

Computational Resources

This course will require basic familiarity with Matlab, and many of the homework assignments will include Matlab scripts to help you get started. However, you are strongly encouraged to use the language of your choice and to experiment with other frameworks.

Here is a list of some recommended options:

Scipy

Scipy is an actively developed open-source scientific software library written in Python. Several of its key developers are here at Berkeley, including Jarrod Millman and Fernando Perez. Fernando and Kilian Koepsell organize a weekly meeting at the Redwood Center for anyone who is interested; please see the Py4Science group page.
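
As a rough illustration (not taken from the course materials), here is a minimal Python sketch of the kind of quick computation a homework script might involve, assuming NumPy and Matplotlib are installed; the 5 Hz sine wave and noise level are arbitrary choices made for this example.

  # Minimal illustrative sketch: build a noisy sine wave and plot it.
  import numpy as np
  import matplotlib.pyplot as plt

  t = np.linspace(0, 1, 1000)                    # 1 s of "time" sampled at 1 kHz
  clean = np.sin(2 * np.pi * 5 * t)              # 5 Hz sine wave
  noisy = clean + 0.3 * np.random.randn(t.size)  # add Gaussian noise

  plt.plot(t, noisy, label='noisy')
  plt.plot(t, clean, label='clean')
  plt.xlabel('time (s)')
  plt.ylabel('amplitude')
  plt.legend()
  plt.show()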

Sage

Sage is an open-source alternative to Matlab and Mathematica. It was started by an algebraic number theorist and has grown into an astonishingly powerful and fast system, complete with a web front-end and interfaces to many other software systems (including all the others on this list). You can even try it online (see the link above). It is written in Python and Cython.

R

R is an open-source statistics environment popular at Berkeley. It has a large collection of contributed packages, as well as many statistical plots not available in other packages.

Octave

Octave is a free program that aims to be compatible with Matlab.

Matlab

Matlab is possibly the most prevalent commercial scientific software. It has a large community of users in academia, many of whom publish code on their websites. It has a strong signal processing toolbox, and it is very easy to learn, with a nice graphical front-end. A student version can be purchased at the Scholar's Workstation for around $100, though additional toolboxes must be purchased separately.

The program has an extensive help system and tutorials. Most questions you will have regarding Matlab have already been asked and answered on the comp.soft-sys.matlab newsgroup.

Mathematica

Mathematica is a large commercial system for doing mathematics with a clean and powerful unified design. Despite being known for its symbolic prowess, it has highly optimized numerical routines and the ability to compile code inline. Wolfram Research has recently created favorable per-semester pricing for students for its latest version of the software.

For a sampling of some of the things you can do with Mathematica, check out the Demonstrations Project and Michael Trott's homepage.

Mapping code

There is a lot of help on the web for mapping from one language to another. Here is a very helpful site that compares a subset of these languages side by side.
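
To give a flavor of what such a mapping looks like, here is a small, hypothetical Python/NumPy sketch with the approximate Matlab equivalents noted in comments. It is only a sampler of common linear-algebra idioms, not a complete translation table.

  # A few common Matlab idioms and their rough NumPy counterparts
  # (illustrative only, not an exhaustive mapping).
  import numpy as np

  A = np.random.rand(3, 3)           # Matlab: A = rand(3,3);
  b = np.ones(3)                     # Matlab: b = ones(3,1);

  x = np.linalg.solve(A, b)          # Matlab: x = A \ b;
  evals, evecs = np.linalg.eig(A)    # Matlab: [V, D] = eig(A);  (return order differs)
  At = A.T                           # Matlab: A'
  elementwise = A * A                # Matlab: A .* A   (* is elementwise in NumPy)
  matrix_product = A.dot(A)          # Matlab: A * A    (matrix product)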

Additional Textbooks

Biology

  • Kandel, E.R. and Schwartz, J.H. and Jessell, T.M. Principles of Neural Science. McGraw-Hill, 2000.
    • Standard neuroscience text, an encyclopedic collection covering the main topics in neuroscience. Little theory.
  • Koch, C. Biophysics of Computation: Information Processing in Single Neurons. Oxford University Press, 1998.
    • Explores in rich detail experimental and theoretical findings in single neuron biophysics. Compartment models, cable equation.
  • Rieke, F. and Bialek, W. and Warland, D. and Van Steveninck, R.R. Spikes: Exploring the Neural Code. Bradford Book, 1999.
    • Readable account of methods used to characterize how neurons encode stimuli, with detailed mathematical appendix. Information theoretic bias.

Machine learning and neural networks

  • Arbib, M.A. The Handbook of Brain Theory and Neural Networks, 2nd Edition. The MIT Press, 2002.
    • An encyclopedic collection of short articles written by leading researchers about anything you could imagine related to neural networks and the brain. Mostly theory and computation.
  • Bishop, C. Pattern Recognition and Machine Learning. Springer, 2007.
    • A well illustrated and up-to-date expository machine learning textbook.
  • Duda, R.O. and Hart, P.E. and Stork, D.G. Pattern Classification. Wiley, 2001.
    • The updated version of this classic text covers many recent topics in statistical learning theory and neural networks.
  • Hastie, T. and Tibshirani, R. and Friedman, J.H. The Elements of Statistical Learning. Springer, 2003.
    • Another commonly used text in machine learning.

Probability and computational math

  • Cover, T.M. and Thomas, J.A. Elements of Information Theory, 2nd edition. Wiley, 2007.
    • The standard text on information theory. Complete proofs of standard results. Recently updated.
  • Papoulis, A. Probability, Random Variables and Stochastic Processes. McGraw-Hill, 2002.
    • An excellent, accessible reference for many topics in basic probability.
  • Press, W.H. and Teukolsky, S.A. and Vetterling, W.T. and Flannery, B.P. Numerical Recipes in C: The Art of Scientific Computing. Cambridge University Press, 1997. Available online
    • Standard scientific computing text. Clear, brief explanations of theory as well as practical aspects of numerical methods. Recently updated and available only in C++ form.

Related courses at Berkeley

  • TCN: Computational neuroscience journal club