We have this crazy situation here where, if you wish to work in certain jobs, at certain companies, or even in some professions, you MUST join the union that represents workers in that field. If you don't join the union, you can't work there. Unions don't affect me in my profession, but they have affected other people around me, including my children (their teachers must belong to their union in order to teach). So what does everyone think of unions? Have you had good experiences with them? Bad experiences? Do you feel they are necessary?