Cinema of the United States

The cinema of the United States, often metonymously referred to as Hollywood, has had a profound effect on the film industry in general since the early 20th century.