garnet13aj
Active Member
Lately I've been having some identity issues. I was born in America and have a European racial background. I don't identify w/my European roots at all and consider myself wholly American, but I'm not sure what that means. I've been struggling w/these questions:
What does it mean to be an American?
Is there such a thing as a distinct American culture? What is it? Or are we just a huge melting pot of all the world's cultures with nothing to claim as our own?
When I try to think of things that are distinctly American, I get kind of depressed. Does being American mean eating hot dogs, drinking to get drunk (my Spanish teacher tells me this is distinctly American--she's from Spain), being a workaholic, eating larger portions, and having issues with our body image? There must be more to our culture, but I can't think of it off the top of my head. Help me out, guys--I love this country (most of the time), but I'm not sure why.
I'll share one story my German scuba-diving guide told me about why he chooses to live in America. He said that if you watch an American family out waterskiing on their child's first try, this is what will happen: the boat will start, "Johnny" will stand up on his water skis, and then he'll fall down in about two seconds. His parents will say something like, "Good job, Johnny! Try it again, you'll get the hang of it"--generally positive reinforcement. He said that in Germany, in the same situation, the parents would say to Johnny, "I knew you couldn't do it, what should I have expected?"--something generally negative in nature. This is obviously just one story, but looking back on my own childhood, I'd say it rings true for most people I know, including myself.