
Organizational Meeting  
We are seeking speakers for this year's Graduate Student Colloquium.
 
Slumdog Millionaire  
Srinivasa Ramanujan was an Indian mathematician who at times lived in poverty
and, as paper was scarce, recorded only his final calculations. He filled
four 'notebooks' with line after line of results that launched whole new
areas of mathematics, along with the careers of those who tried to verify the
beautiful identities they found. We'll give a brief account of Ramanujan's
life and of selected results.
 
Adventures in Calculus  
I will present several experiments that I conducted while teaching our own
Math 124 in 23 days this summer. I will discuss which were successes, which
were failures, and which didn't work but still have potential. Faculty,
applied math students, undergraduates, and anybody else who is interested are
encouraged to attend.
 
Gravitational solitons  
In 1978 Belinskii and Zakharov developed a method (the Inverse Scattering Transform)
for generating new solutions of Einstein's field equations from known solutions.
These new solutions are gravitational solitons: gravitational waves in
spacetime that are localized and maintain their shape even after interacting
with other solitons. The method reveals a beautiful connection between the
singularities of certain complex-valued functions and the physical properties of the
gravitational waves.
 
A Simple Spectral Counterexample  
The spectra of operators, such as Hamiltonian operators in quantum mechanics,
are in general difficult to find explicitly. Fortunately, for many problems there is no
need to examine the whole spectrum; one can instead use certain trace invariants to
solve these problems or to find approximate solutions. However, these invariants do not
retain all of the information about the spectrum.
 
Entropy and Huffman Codes  
Information theory, invented by Claude Shannon in 1948, gives us
fundamental limits on how we can store and communicate data. This talk will
be an introduction to two basic ideas in the field. Entropy is a number
that quantifies the amount of uncertainty involved in predicting a random
variable. Huffman coding is a method of lossless data compression that is
optimal in a sense to be revealed in the talk.
So come along and see a fundamental limit on how we can store data, and then a
clever idea that tells us how to get as close to that limit as possible. I
won't use any mathematical tools beyond precalculus, but I will prove a few
small facts. Hopefully you will enjoy the proofs as much as I do.
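The relationship between these two ideas can be seen in a few lines of code. As a small illustration (not part of the talk itself), the Python sketch below computes the entropy of a source and builds a Huffman code via the standard greedy merge of the two least-frequent subtrees; the average codeword length always lands between the entropy H and H + 1 bits.

```python
import heapq
from collections import Counter
from math import log2

def entropy(probs):
    """Shannon entropy, in bits, of a probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_code(freqs):
    """Build a Huffman code from a symbol -> frequency mapping.

    Returns a dict mapping each symbol to its binary codeword string.
    """
    # Each heap entry: (total frequency, tiebreaker, {symbol: codeword so far})
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        # Greedily merge the two least-frequent subtrees
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}      # left branch: prepend 0
        merged.update({s: "1" + w for s, w in c2.items()})  # right branch: prepend 1
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

text = "abracadabra"
freqs = Counter(text)
total = len(text)
code = huffman_code(freqs)
avg_len = sum(freqs[s] * len(code[s]) for s in freqs) / total
H = entropy([f / total for f in freqs.values()])
# Entropy lower-bounds the average codeword length: H <= avg_len < H + 1
print(f"H = {H:.3f} bits, average Huffman codeword length = {avg_len:.3f} bits")
```

For "abracadabra" the entropy is about 2.04 bits per symbol, while the Huffman code averages about 2.09 bits per symbol, comfortably inside the promised window.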
 
Title To be determined  
Abstract to be determined
 
Title To be determined  
Abstract to be determined
 
Title To be determined  
Abstract to be determined
 
Title To be determined  
Abstract to be determined
 
Title To be determined  
Abstract to be determined
 
Title To be determined  
Abstract to be determined
 
Title To be determined  
Abstract to be determined
 
Title To be determined  
Abstract to be determined
 
Title To be determined  
Abstract to be determined
 
Title To be determined  
Abstract to be determined

This page is maintained by Shane Passon.
Email me if you have questions or comments.