American Art Definition
American art generally refers to art originating in the North American colonies and the United States. Although the term can denote a wide variety of styles and methods, perhaps the most recognizable and iconic American art form is the painting of realistic portraits and landscapes.
Category: Art