West Florida

West Florida was a Spanish colony in North America that declared itself an independent republic in 1810 and was annexed by the United States later that same year. The region subsequently became part of the state of Louisiana upon its admission to the Union in 1812.
wikipedia.org, dumped 2003-03-17 with terodump