How did Florida become a territory of the United States in 1821?

A. Spain sold the territory of Florida to the United States.
B. The United States traded Georgia to Spain for Florida.
C. The United States won the war against Spain and took over the territory.
D. Spain gave the territory to the United States after the First Seminole War.