Embedded false truths: When trust works against progress

There’s a sort of conversational thing, especially on the toneless internet, where you don’t want to be talking to someone who just wants to screw you around or treats you like a bug.

And that’s fair enough. But at the same time, the roleplay community tends to identify the trustworthy and well meaning by comparing notes on ‘how things work’. If someone shares your notion of how things work, they’re worthy of some trust, or well meaning, or whatever.

The thing is, if this notion of ‘how things work’ is actually false, then it’ll never get corrected. Because to question the notion is to undermine the reason the other person trusted you in the first place.

I’ll draw an example from D&D boards, one I hope (in order to make a point) at least my readership don’t share as a notion of ‘how things work’: ‘It’s not really railroading if the players don’t realise it’. Agree with it, and you get trust. Don’t agree, and you end up on the outside of something. But if the notion is a false one, then it becomes embedded and accepted and can never be challenged, because as soon as you challenge it, the other person retracts their trust and ignores the challenge. They say you’re being ‘combative’. And that’s when they’re being polite.

Though I’ll grant that if you meet the person in real life, you may develop other reasons to trust them or think them well meaning: they gave you a beer, or lent you a DVD, or you lent them something and they returned it. This gives some room to later on question the notion. Questioning it will still tear down the trust that came from the notion being shared between the two of you, unfortunately. But the other things that indicate trust will remain, and all is good.

But on the internet, do we share beer or DVDs? Not that often. So on the internet, in roleplay discussions, how often do you have to give the nod to a false notion in order to be able to discuss anything at all, to be trusted enough to actually talk with?

And how much does this embed false notions and perpetuate them in online roleplay culture?

’Cause, you know, it’s not really railroading if the players don’t realise it…

Would you enjoy agreeing to that notion, just to get enough trust to actually talk with someone? And your own notions, the ones you expect someone else to just agree with or else prove themselves untrustworthy – are you perpetuating something just as much?


7 thoughts on “Embedded false truths: When trust works against progress”

  1. I hope you don’t mind a little digression and a little anecdote. If you do, skip the following three paragraphs.

I am an agnostic: I don’t claim to know whether there is a god, or many gods, or spirits, or reincarnation, or whatever. I used to despise Christianity when I was a kid, but have gotten over it. Regardless, I don’t feel I understand devout Christians, and many of them are intelligent folk, so there must be something interesting to understand.

A few days past I talked for an hour or so about Christianity with a devout fundamentalist Christian (fundamentalist in the sense that he considers the Bible to be the infallible word of God; not in any other sense). Pretty early I made it clear that he had little hope of converting me, that I had equally little hope of swaying him, and further that I had no desire to do so. Right. Then I asked pointed questions, listened to what he was saying, pointed out inconsistencies I perceived, and so on. He answered according to what he thought and believed.

The outcome: I feel personally enriched by the conversation. My stance with regards to agnosticism and the divine remains unchanged, but now I have something of a grasp of how some fundamentalist religious people think. I also learned a little bit about myself, which is always valuable.

To tie this back to roleplaying, I see the above as comparing notes on how things work. I find it useful to know how people think things work, even if I disagree with them. The reason behind this is, I assume, optimism: if there is a significant number of people who believe in some thing, there probably is some reason for it, and I’d like to know that reason. Often something can be learned from it, even if I consider the idea as a whole to be irredeemable.

    Further, if I wanted to persuade someone of, say, it really being railroading even if the players are unaware, it would be useful to know where they are coming from. It allows me to argue against what they really believe, as opposed to some strawman.

    Our discussions, for example, are full of poor communication. It makes them difficult and reduces their value, in my opinion.

    Would you enjoy agreeing to that notion, just to get enough trust to actually talk with someone?

    I would not; as with the fundamentalist, I’d say that I disagree but really want to know where they are coming from. If I wanted to persuade them, that would be another discussion.

    Many people have no interest in exploring where they are coming from; that I accept, though I don’t understand it. Some are interested in such discussions; them, I respect.

And your own notions, the ones you expect someone else to just agree with or else prove themselves untrustworthy – are you perpetuating something just as much?

    If you want to attack my position and hope to achieve something, first understand where I am coming from. You don’t need to agree with it, as long as you know what you are talking about.

2. I wrote this referring to people who want to self-correct any errors (errors by their own estimate) they hold in regards to a topic, but didn’t explicitly write this. I’ll keep that in mind for next time. For someone who feels they have no errors to correct in regards to a subject, that’s another subject! 🙂

    If you want to attack my position and hope to achieve something, first understand where I am coming from. You don’t need to agree with it, as long as you know what you are talking about.

    I don’t see any practical value to such requirements?

    Edit: And I mean the practical value of the other person agreeing you understand where they are coming from.

    2nd Edit: Even more so, why would they always know where they are coming from?

    The reason behind this is, I assume, optimism: If there is a significant amount of people who believe in some thing, there probably is some reason for it, and I’d like to know that reason.

    Some reason for it? Some reason for the belief and it being widespread? Or some reason for the focus of the belief?

  3. I don’t see any practical value to such requirements?

Let us assume that they have some claim you want to disprove. You can attack the claim at three different segments, so to say: foundations (including their base assumptions, experiences and definitions), reasoning, or conclusions. There typically are different reactions to attacks on different segments; some foundations are so deep that attacking them is pointless. The Christian I talked to was born again; he had an experience where the Holy Spirit (or Ghost, or whatever it is called) settled in him, causing a warm and fuzzy feeling. I have little hope of arguing with that. Likewise, you’d have a hard time convincing me that understanding how and why things work is not valuable in and of itself.

    More to the point, if someone defines railroading as “players feeling the GM pushes them around too much”, then, by definition, it is not railroading unless the players notice (and mind). You’d have to say you have a better definition, and here’s why and how it is better.

    You talked about trust in your post. When someone knows that you understand what they are talking about, that creates trust; further, it allows using more specialised jargon and ignoring certain qualifiers, hence making communication more effective. (See, for example, GNS discussions by those familiar with it.)

    People generally don’t know what assumptions they make; some do know, many would benefit from knowing (I think).

    Some reason for it? Some reason for the belief and it being widespread? Or some reason for the focus of the belief?

    All three, I think, though the second one tends to be not as interesting as other factors.

    If a number of people believe something, then it often makes sense in some way, from some perspective. Learning that perspective is, IMO, valuable (for pragmatic reasons and as a base assumption I hold).

4. As I said, I refer to people who want to self-correct any errors.

As far as I can tell, you can’t disprove something to someone who does not want to self-correct errors on the matter. They can always do the equivalent of sticking their fingers in their ears. Only someone who harbours at least a small amount of doubt about their correctness on the subject can self-correct.

Well, perhaps in the grossly physical plane, they can. They say the boat won’t sink, they take it out to sea, and suddenly they’re up to their necks in water… then people tend to accept they were wrong, where they stuck their fingers in their ears before.

In terms of a lot of people believing something, I don’t know why it makes sense any more than if one person believed it.

5. Only someone who harbours at least a small amount of doubt about their correctness on the subject can self-correct.

Yep. There’s no way of finding out whether someone is willing to reconsider their beliefs other than probing.

In terms of a lot of people believing something, I don’t know why it makes sense any more than if one person believed it.

That more people believe something does not (in general) make the something any more true; but I would say it makes the something more likely to be true.

A single person can easily be utterly wrong about something, but it takes effort to make a sizable number of people actively wrong (as opposed to merely uninformed) about something. Like I said, optimism.

6. I didn’t refer to them reconsidering their beliefs, just correcting what they themselves would think is a conflicting error. If that means a belief is extinguished as a side effect of that correction, it’s still just a side effect rather than the main goal. And I don’t want to find out if they want to; I simply assume they do. And, as usual, assuming is a mistake. I’m wondering how to self-correct on that one.

In terms of large groups it seems more like a lack of effort is what makes them wrong, rather than more effort. A lack of effort put into scrutiny and simply resting on the laurels of ‘large group == right’. This doesn’t even work in Darwinian terms these days: just because they’re alive at all doesn’t mean their hypothesis helped them stay alive. Too much infrastructure is there these days (at least in first world countries) to help the sick, the infirm, and even the deluded.

  7. I didn’t refer to them reconsidering their beliefs, just correcting what they themselves would think is a conflicting error.

    Okay, I’m confused. Want to give an example?

    In terms of large groups it seems more like a lack of effort is what makes them wrong, rather than more effort. A lack of effort put into scrutiny and simply resting on the laurels of ‘large group == right’.

    Lack of effort means that people may be wrong or right, but usually because they don’t know or care about alternatives. To have people be informed about something and still wrong takes work.
