Gamify Poverty

How it would work isn’t clear, but it’s an intriguing idea: handle complex social problems by turning them into games, with the stakeholders as players and the remote goals left alone in favor of incremental achievement. It could work for many different social and political problems, but today I will focus on poverty as an example.

When someone is impoverished, they might sign up for food stamps and other forms of welfare like Medicaid and housing assistance. Republicans want to add more and more work requirements: strings attached to receiving assistance, ostensibly to help people escape poverty. As though the problem of poverty is as simple as “get a job.” And then, of course, you have to balance assistance against wages earned to keep people from being better off staying poor than working a dead-end job. It’s quite a messy problem.

One of the key features of gamifying is to offer both group achievement opportunities and individual achievement opportunities. Some respond better to one than the other, so having both is important. Thus, a person in need of assistance would either be assigned or would choose and join a group of others (likely at a mixture of stages along the route away from poverty) and would work with them on certain tasks.

They would also have individual tasks, with the possibility of individual achievement (and thus reinforcement). Part of the problem with the current model of poverty assistance is that people can do the right thing and go wholly unrewarded, heightening the chance they will fail (an unreinforced positive behavior tends to be extinguished).

Bootstrapping the current welfare model, for instance, a person receiving $1 of assistance by default should receive $1.10 if they look for job opportunities, $1.20 if they research particular opportunities, $1.30 if they fill out an application, $1.40 if they go to an interview, etc. Instead of work-fare, it should be game-fare.
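As a rough sketch of that escalating payout (the activity names and bonus amounts here are invented for illustration, not a proposal for actual benefit levels):

```python
# Hypothetical "game-fare" reward ladder: each completed activity adds a
# bonus to the base assistance multiplier. Names and values are invented
# for illustration only.
REWARD_LADDER = {
    "searched_for_jobs": 0.10,
    "researched_opportunities": 0.10,
    "filled_application": 0.10,
    "attended_interview": 0.10,
}

def assistance_multiplier(completed: set[str]) -> float:
    """Base $1 of assistance plus a bonus per completed activity."""
    return 1.0 + sum(bonus for activity, bonus in REWARD_LADDER.items()
                     if activity in completed)

def payout(base_dollars: float, completed: set[str]) -> float:
    """Dollars actually disbursed, given the activities completed."""
    return round(base_dollars * assistance_multiplier(completed), 2)
```

So someone who searched for jobs and filled out an application would receive $1.20 for every default $1, and completing the whole ladder would pay $1.40.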

Similarly, for public housing, there should be some amount of discretionary spending allocated, which residents can use in their group (either a floor or a building or whatever pod-size makes sense) for improvements. The data from these events can be used to evaluate the sorts of improvements that will benefit other public housing areas, as well as provide the residents with experience in making improvements for when they have their own homes.

All of this can go atop some sort of score-keeping system, so that the assisted can track their progress, finding ways to improve their scores that also help them escape poverty. By formalizing it into a game, you get a ton of data that can further improve the game and you get a plethora of reinforcement opportunities where the assisted can see their actions resulting in some tangible gain, even if only in game points.

This same model could be used in many other areas, such as veteran reintegration, prison reform, helping people with medical problems (“the diabetes control game”), and so on.

The biggest obstacles to this sort of reform are probably:

  1. Calling something a game may be misperceived to be making light of a serious situation.
  2. Politicians are preternaturally opposed to good ideas.

Congratulations for Reading This!

Participation awards. People have a problem with participation awards. The argument goes that awards which don’t signify an achievement these busybodies find significant weaken their own self-esteem, to the point where they no longer feel special when they get their tenth sandwich free at whatever hole in the wall they thought they were competing in America’s Got Eating a Lot or whatever.

The truth is, psychology is against them. The reality of our lives is against them. Yet they feel they have a beef. Awards for little league seem to be their primary target. Story goes, a little kid, or even a young adult, played a sport. They didn’t send their opponents home in a St. John’s Ambulance, so they don’t deserve to be “rewarded” (because a molded-plastic trophy is something other than a mere souvenir) for “just showing up.”

I’m 100% certain that these people making a federal case out of participation awards do not celebrate their birthdays. Birthdays are the ultimate participation award. You didn’t die for a year. Have a cake and we’ll sing and here’s a gift card to that sandwich joint you keep bragging about getting your holes punched at, you bastard. No way. Birthdays? Please. Do something big, for a whole year, like hopping only on one foot the whole year, or go a year without using the word “ball,” including compounds thereof, then we’ll talk. Then you can have your fucking cake.

Same goes for that tenth-sandwich-free thing. Eating regularly isn’t an achievement. It’s a privilege. There are starving people in this world, and merely being lucky enough to not be one of them does not constitute an achievement (unless you did something like invent farming, maybe). If you’re so concerned about false awards, refuse that freebie.

These same people, when they go on vacation, they won’t buy souvenirs. Going to a museum does not qualify as achievement, so they can’t buy the reminder. Oh, wait. In that case, it’s commerce. So if their child buys himself the plastic twerking trophy, they’re cool with it? What if their child steals it? Isn’t successful theft an accomplishment of sorts? What if all the other teams in the AAAA…A ball league, that they somehow believe might qualify their little rascals for being called up to the big leagues, suddenly get chicken pox at the same time? The team wins by default? Is the trophy somehow earned then? What if the munchkins arranged for the whole league to get the pox? Does that count?

Anyway, back to psychology. If you want kids active, and many kids aren’t, rewarding them for being active is good. That’s not to say those trophies are really about reward. That’s a conceit in the mind of the parents. The trophy is so they can look back and remember how many sports they tried and maybe remember how much they grew as fucking human beings. The horror! It reminds them they can participate at all. That even if they weren’t the best, the world needs support roles, and sometimes the support classes make the difference, even if the parents of the quarterback think they shit gold.

In psychology, if you want a behavior to sustain and increase, you reinforce it. If you want it to cease, you withhold reinforcement. If you take away kids’ silly trophies, and if those trophies were rewarding to them, you may just find they stop participating. If they don’t care about the trophies, you saved a few bucks on trophies, and you reduced the amount of plastic waste in circulation.

Turn Left onto Memetic Circle

So the other important issue here is the so-called Putrification (Putrefaction) of America. Sometimes called the Pussification or Pussefaction. It’s a form of the Politically Correct meme, though purveyors of this version believe they are firmly anti-PCness.

I’m 90% sure this anti-participation-award crap came out of the mind of someone on the lookout for pussefying. The neighborhood pussy watch or something. Wait. The pussy posse. There you go.

But the point is, this idea that there is a definite trend in some vein means you can then find the fucking pattern everywhere. The frequency illusion. You know, you find a dollar on the street and then suddenly you’re finding dollars everywhere. Except with a word, or a concept, or whatever. Not with money. Never with money. Damn you, Baader-Meinhof Phenomenon!

The difference in this case is that the pussefaction-believer wants confirmation. They aren’t looking at all the ways America is not being changed, or is being changed in other ways. They only look for, only see, things that confirm.

In closing, a list of things that participation award haters also cannot do:

  • Read this post, as it would make them winners of the “I read that” award that all readers of this article are awarded
  • Have funerals unless they died in some really cool and/or unique way, as, like birthdays, funerals should be reserved for the badasses only
  • Get married, unless they rescued their spouse-to-be or otherwise achieved something that deserves it, like haggling for a really great price on the venue, cake, flowers, or so on
  • Take advantage of income tax deductions or credits
  • Partake of the post-orgy group-cuddling (unless they orgasmed more than the average number of orgasms)
  • Celebrate Christmas or Easter or really any holy day, unless they have done something to earn it, dammit (I’m sure the Easter Egg Hunts they put on for their kids include spikepits and other hazards)
  • Same for Independence Day: unless you fought in the Revolutionary War, no fireworks, hotdogs, or s’mores for you!
  • Drink cocktails that have garnishes (self-explanatory)
  • Give their kids tooth-fairy money
  • Wear any sports paraphernalia from any team they were not an integral part of (this includes political signage, etc.)
  • Accept the “I Voted” sticker, unless you did a really kick-ass job of voting, I guess

And so on. Let’s call these freeloaders out! Always keeping the extra soda that accidentally fell out of the soda machine, like they earned it. Always parking in an empty parking space without even having to drive around for hours, like it’s their birthright.

But seriously, there are a ton of no-effort trophies in life, and it’s a bit silly to only clamp down on the ones given to children and young adults while ignoring all the others. If you want to be macho, grow a mustache. If you want your children to achieve something, help them by giving them opportunities to fail and the freedom to choose what to try.

Anyway. Enough words wasted on that silly idea. Congratulations for reading this post. You did it! I hope you will keep reading things (written by me or not) in the future! Good job, winner of the “I read that” award (no monetary value, offer void where prohibited, bestowal of award may not be used as evidence that you actually read anything, etc.)! For a copy of your award, print this article out, and scribble your name on it and have it framed or gilded or something cool. You can even hold yourself a little ceremony where you give it to yourself and make a speech and drink champagne. Then later, you can hold an ethics hearing into the allegations you didn’t really read this article, and you can have a committee decide whether you can keep the award. Then you can go on an apology tour where you try to reassure everyone concerned that you messed up, okay, you get it, but you have changed, in your heart, which is the place where change is the most meaningful, and you deserve a second chance. You don’t have to, but you could do all of that. It would be quite an achievement if you did.


Knowing the Greatest Bug Number

Most open-source projects have a bug tracker, and bugs are numbered sequentially. Thus, knowing the greatest bug number (GBN), the highest number yet assigned, is an important piece of information if you’re looking for what you believe is a new (globally, not just new to you) bug.

You do a search for bugs, and one of the things you see is their number. If you know roughly what the GBN is, you can tell at a glance whether the bugs in your search results are new or old. That will give you important information, as a bug from ten years ago, from a deprecated version that won’t even run with your current libraries, etc. is probably not your bug.

Of course, technically knowing the GBN isn’t sufficient. You must also have some vague idea of the reporting cadence. That is, if the GBN is 600K, and you’re looking at bug 550K, the rough time to accumulate the intervening 50K bugs could be large or small. And the cadence changes, probably following an elliptical orbit (speeding up close to release, slowing down at the midpoint between releases).

The question is, would it make sense to change how bugs are numbered? Would it make more sense to have bugs numbered like: YYYYMM-NNN? Adding in DD would probably be too fine-grained, but months probably fit well for most projects.
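As a minimal sketch of minting such an identifier (the exact format and the assumption that the counter resets each month are mine, not any real tracker’s scheme):

```python
from datetime import date

def mint_bug_id(seq_in_month: int, filed: date) -> str:
    """Format a bug ID as YYYYMM-NNN, zero-padding the per-month counter.

    Assumes the counter resets to 1 at the start of each month; it simply
    grows past three digits for busier projects.
    """
    return f"{filed.year:04d}{filed.month:02d}-{seq_in_month:03d}"

def parse_bug_id(bug_id: str) -> tuple[int, int, int]:
    """Recover (year, month, per-month sequence) from a YYYYMM-NNN ID."""
    stamp, seq = bug_id.split("-")
    return int(stamp[:4]), int(stamp[4:6]), int(seq)
```

For example, the seventh bug filed in May 2013 would come out as "201305-007", and parsing "201305-2468" immediately tells you both when it was filed and how busy that month was.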

Bug numbers do not have an inherent meaning. They are already security-compromised in terms of giving away the timeframe they were filed. (That can be avoided by squirreling away some numbers and filing them late, but it’s unlikely to be done in open source. Even if it were, you could still squirrel some date-stamped bug numbers.)

You would lose guess-the-date-for-bug-N contests, though you could technically still hold one; it would just use bug numbers people don’t refer to as often.

You might also gain a benefit: developers would have the date-of-filing staring them in the face every time they referenced a bug. I don’t think that would really matter; software gets built as best it can. Most unfixed bugs aren’t from laziness, but from difficulty, lack of information, and lack of time.

The other side of that is the users who will say, “this bug is five years old.” They already exist, though, so it doesn’t seem like this adds to that problem.

But having a year-month-numbered bug can, in some cases, give an immediate idea of a project’s size. A bug like YYYYMM-1 won’t, but a bug like YYYYMM-2468 will. They’re getting at least thousands of bugs per month.

There might be some technical issues. Will a bug database handle a lookup as easily when fragmented into years and months? Should there be that dash or should it be YYYYMMNN? Will people think the Ns are days?

What other benefits would this scheme achieve? Could you type MMNNN for the current year in a search? Or NNN for the current month? Will people get frustrated when it’s the first day of the month and their bug now needs to be typed out? Could you use some shorthand for that case?
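One way the shorthand question could shake out, as a sketch (these expansion rules are my own guesses, not a feature of any existing tracker):

```python
from datetime import date

def expand_shorthand(query: str, today: date) -> str:
    """Expand a partial bug ID to the full YYYYMM-NNN form.

    Guessed expansion rules, for illustration only:
      - "NNN"        -> current year and month
      - "MM-NNN"     -> the given month of the current year
      - "YYYYMM-NNN" -> already full, returned unchanged
    """
    if "-" not in query:
        # Bare sequence number: assume the current month.
        return f"{today.year:04d}{today.month:02d}-{int(query):03d}"
    left, seq = query.split("-")
    if len(left) == 2:
        # Month only: fill in the current year.
        return f"{today.year:04d}{int(left):02d}-{int(seq):03d}"
    return query  # assume already a full YYYYMM-NNN identifier
```

Under these rules, typing "123" in May 2013 would find 201305-123, and "04-007" would reach back to April without spelling out the year. The first-of-the-month annoyance remains: on June 1st, "123" suddenly means a different bug than it did the night before.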

I think using partial dates rather than purely sequential bug numbers (at least as an alias; there’s probably value in keeping the regular, sequential numbers, too) may have some use, but what do you think?


Cult Thinking and Terrorists

Tragic events pain us, and even more so for the media’s failure to put them in proper context. The media fails to educate, to the point that it prefers to fill dead air with gossip and innuendo rather than purely educational content.

On some issues they may paint a fair picture, such as when they cover cults. Most of the time, the harm cults do to society comes in alienation and the wasting of resources. The media seldom covers cults unless their harm grows far beyond this basic level, to mass suicide or worse.

But many events we see in the news are intimately related to the sort of cultural relativism needed to understand cults. None more so than terrorism, and the world view that allows for it.

First, one might contrast the reaction to domestic incidents with those that take place overseas. The media tends to barely report terrorist bombings in Iraq, for example. They certainly do not follow any manhunts, seek out family, neighbors, and other acquaintances to interview, and the like.

This itself shows the sort of tribal, cultist worldview. The value difference based purely on nationality or locality is essential to terrorism and cults in general. But that value difference laces most any culture.

The feature of the media coverage that stands out is an unanswered question (the media should both ask questions and answer them, or seek answers): ‘how could terrorists kill the innocent (children, civilians)?’ But worse than the media, this sentiment arises from elected officials (which suggests the need for a Constitutional Amendment requiring continuing education for all legislators).

The basic formula of the cult, of terrorists:

  1. The world differs from how you learned to view it (and therefore from how your teachers view it and how their group views it).
  2. There will be calamity unless most people come to view it correctly.
  3. For people’s minds to change, YOU must participate in some activity that you wouldn’t do without the cult’s programming.

It’s a little more involved, especially the use of ego control (emotional abuse to train the person to become dependent on the cult, and more importantly on the fulfillment of its promise, for emotional health), isolation (to prevent opportunities for cognitive dissonance), and other techniques.

The belief that one’s soul hangs upon carrying out a religious or ritualistic promise to the gods, and that failing to continue once promised would essentially doom one to hellfire, illustrates why many single out religion as a problem. But the same can be said of any religion that posits the existence of a hell, and pointing to non-cultist believers as both wrong and faithful simply strengthens the belief.

To understand an act of terror, one must unpack its meaning not as it appears to the asker, but as it appears within the terrorist or cultist worldview. Ultimately the prevention of terrorism relies upon this sort of thinking. Some measure of terrorist acts may be prevented through law enforcement and military operations, but most terrorism will need to be disarmed through cultural actions, not violent ones.

But society needs this sort of understanding not just for combating terrorism, but cults, racism, and fascism of all sorts. We need to be taught to unpack our own culture from time to time and recognize the dysfunctional and functional parts. It doesn’t ruin a thing to understand it, yet it seems a part of our culture believes exactly that it does.


Getting Past the Reflexive Response

One of the phenomena we see in discussions of changes to society is a purely reflexive response. We see this both affirmatively and negatively, depending on the way the issue is couched. A guess is that the responses tend to be more negative, that reflexive responses in general tend to be “no.”

New York City recently enacted a law against certain food establishments selling soft drinks larger than 16 fl. oz. (approx. 0.5 liters). Many people had a reflexive response against that move. The belief that individuals and businesses, rather than government, should have the right to make that sort of decision fired rapidly in the mind. This was followed by the section of the brain containing the term “nanny state” jumping up and down, yelling, “me! me! me!”

While I think a reasonable person can disagree with the implementation of the ban, it’s harder to make a case against the idea that people should drink less fizzy sugar water. But let’s set that case aside, and just focus on the reflex.

It seems like the reflex is a combination of the brain having existing wiring for the type of argument and a tendency to take a defensive posture against change. We see the same disposition in many subcultures, including political and religious ones.

In the case of soft drinks, people have encountered dietary arguments for years from vegetarian, vegan, and similar dietary movements. The anti-smoking arguments also follow similar lines. With recent studies showing correlation between social connections and things like weight gain and diet, even the second-hand smoke arguments have a home here.

People also have received reinforcement from something like sipping on a soft drink while having positive social interactions, so much that some may be able to tell you that they enjoyed a particular flavor of drink during a particular interaction (not unlike people remembering specific times with specific types of alcohol).

The notion of giving up something that seemed to add to an experience is threatening. It usually takes several nearby nodes in the network making a change in order to encourage more nodes to change.

Not In My Back Yard, or NIMBY, is another example of a mantric argument that is conjured when a reflexive response occurs. Windmills are often opposed as a reflex.

The notion of job security has paralyzed whole sectors of the economy, as we become afraid to modernize and shift economic focus because of large blocks of employment. That is, we treat employment as more important than the economic functionality that would ensure it.

There are reflexive responses when someone denigrates a prophet, or when a community perceives a travesty of justice, and so on.

How do we get past these reflexes? How do we get sane arguments that don’t run into walls of no-from-the-hip?

My hunch is that society, or whatever group seeks to have good arguments, should assign advocates regardless of belief. Just like high school debaters, people can advocate for causes they don’t necessarily believe in. That gives new ideas an opportunity to prosper in a way that doesn’t stigmatize initial advocates too severely (which risks chilling future dissent, leading to the further totalitarianization of a group).

Likewise, increasing opportunities for interaction and shifting of social links would enable more nodes to recognize opportunities for different behaviors. Although anecdotal (in that I haven’t looked for any research that backs this up), I find it likely that part of the positive impact of World War II on the US economy stemmed from the mixing of all those young people, along with their exposure to diverse social orders across the globe.

At any rate, reflexive responses should be seen for what they are. We shouldn’t let them kill good ideas, but should allow ourselves to entertain the idea without fear that it will consume us. Society needs to learn how to do that.