What's up guys? American TV shows have become a big part of our lives, a major form of American cultural diffusion.
So, what's your favorite American TV series?
Do you prefer detective series such as "CSI", musicals like "Glee", or drama series like "Gossip Girl"? Come on, share your tastes with me!
I personally love HIMYM and GoT, even if the others are really great too!