Since the days of the Roman Empire, Christians have believed that the surrounding culture matters for the preservation of the faith, though there has always been a solid minority that has held it to be irrelevant. I think the former idea is closer to the truth, but I fear the day when the latter opinion is no longer firmly, even fanatically, espoused among us.
I will hold all posts for a few days. I would like my readers, a diverse and, really, quite brilliant bunch, to give their thoughts on this large issue. Some of you have your own sites, and I encourage you to link to them in the comments. This is the sort of overarching question whose answer changes slowly in our lives but has large effects down the road.