Scalable social interactions
DragonPrime - LoGD Resource Community
Author Topic: Scalable social interactions  (Read 3253 times)
mr deathrain
« on: November 08, 2005, 01:36:26 PM »

I've helped develop a major online social network, and a huge concern for us was making the ongoing social support scalable. As the base reached millions of users, how does a company continue to enforce certain virtual community standards without having to hire hundreds of customer service people?

It seems LotGD suffers from similar problems. The question is: what kind of features can be added to address the problem in an automated way?

1. Detect and stop bots
Everyone hates captchas; however, they are a necessary evil to stop bots. LotGD has a solid, forgiving system, which is very important for new accounts.
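A minimal sketch of the kind of gate I have in mind for new accounts (Python; every name and threshold here is made up for illustration, not LotGD's actual code):

```python
import random

def make_challenge(rng):
    """Generate a simple arithmetic challenge as a (question, answer) pair."""
    a, b = rng.randint(1, 9), rng.randint(1, 9)
    return f"What is {a} + {b}?", a + b

def is_account_trusted(age_days, solved_challenges, required=3):
    """New accounts stay gated until they are old enough and have solved a
    few challenges; older, proven accounts skip the friction entirely."""
    return age_days >= 1 and solved_challenges >= required
```

The point is that the friction falls only on brand-new accounts; established players never see the challenge again.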

2. Points of interaction - segregation
 - Identify where users interact.
 - Look for ways to segregate groups with different social standards (e.g. conservative vs. liberal) in these areas.

The system already has a few of these things in place; the way newbies learn on the Isle of Wen is a good example.
Note: We've seen that most users tend to self-segregate if the system supports it. (For example, someone willing to see adult-language posts marks it in their profile. Then, when they post, there's a checkbox labeling that post "adult language"... or there are simply areas that only show up for adult language.)
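The self-labeling idea above can be as simple as tag matching between posts and viewer preferences; a hypothetical sketch (the field names are mine, not LotGD's):

```python
def visible_posts(posts, viewer_tags):
    """Show a post only when the viewer has opted in to all of its tags.
    An untagged post is visible to everyone; a post tagged "adult_language"
    is visible only to viewers who marked that preference in their profile."""
    return [p for p in posts if p["tags"] <= viewer_tags]
```

Self-segregation then happens automatically: nobody has to be moderated away from content they never opted in to see.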

3. Voting
Some way for users to vote to mute or ban someone is tricky. The haunting is a nice idea, but maybe it can be taken farther?
There must be some cost limitation on votes.
The result of a vote should limit an offending player's ability to interact for a period of time (for example, no PvP, no posting, losing a day's turns, etc.).
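The cost-plus-sanction idea might look roughly like this (a sketch with invented point values and durations; tune everything per server):

```python
from dataclasses import dataclass, field

VOTE_COST = 5             # points spent per vote, to discourage frivolous voting
MUTE_THRESHOLD = 3        # distinct voters needed before the sanction triggers
MUTE_SECONDS = 24 * 3600  # length of the interaction ban

@dataclass
class Player:
    name: str
    points: int = 20
    mute_votes: set = field(default_factory=set)
    muted_until: float = 0.0

def cast_mute_vote(voter, target, now):
    """Spend points to vote; enough distinct voters mute the target for a day."""
    if voter.points < VOTE_COST:
        return False
    voter.points -= VOTE_COST
    target.mute_votes.add(voter.name)  # a given voter only ever counts once
    if len(target.mute_votes) >= MUTE_THRESHOLD:
        target.muted_until = now + MUTE_SECONDS
    return True
```

Charging points is the cost limitation; counting distinct voters (not raw vote spam) is what keeps one angry player from muting someone alone.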

4. Avoid aggressive punishing
If a player isn't posting anything, they shouldn't be able to be punished for posting something offensive. (It seems obvious, but hey...) For example, clicking on a user's name next to some posted text might lead to a "report inappropriate comment" page.
Reporting inappropriate postings might cause them to be filtered.
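The report-then-filter flow could be sketched like this (threshold and field names are placeholders). Note that hiding the post also caps the total amount of reporting it can attract:

```python
REPORT_HIDE_THRESHOLD = 3  # distinct reporters needed to filter a post

def report_post(post, reporter):
    """Record a report against a specific posted comment; once enough
    distinct users report it, the post is hidden. A hidden post accepts
    no further reports, which bounds the total reporting effort."""
    if post["hidden"]:
        return False
    post["reporters"].add(reporter)
    if len(post["reporters"]) >= REPORT_HIDE_THRESHOLD:
        post["hidden"] = True
    return True
```

Because reports are always attached to a concrete post, a player who hasn't posted anything simply has nothing that can be reported.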

5. Explainability
Users need to understand why they are punished and how to avoid it in the future.

6. Cops
Eventually, there are situations where it's just too difficult to automate things. This is where "cops" can be handy. Cops should meet certain criteria: hours played, an account that has existed for a certain number of years, etc.

This is an interesting area to develop. How do you keep the cops ethical? Maybe cops are segregated, each dealing only with one type of offence.
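The eligibility criteria, including the one-offence-type segregation, might be checked like so (the thresholds are arbitrary examples, nothing official):

```python
MIN_HOURS_PLAYED = 500      # assumed threshold; tune per server
MIN_ACCOUNT_AGE_DAYS = 365  # account must have existed about a year

def eligible_for_cop(hours_played, account_age_days,
                     offence_type, certified_offences):
    """A player qualifies as a cop only for offence types they are
    certified in, and only with enough playtime and account age."""
    return (hours_played >= MIN_HOURS_PLAYED
            and account_age_days >= MIN_ACCOUNT_AGE_DAYS
            and offence_type in certified_offences)
```

Restricting each cop to one offence type limits the damage any single unethical cop can do, which is one answer to the ethics question.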

What to do?
Well, it can be argued that this is an interesting problem to solve; however, it does mean a lot of work and tweaking. It's almost a game within the game.

(My best wishes to the developers of LotGD and whatever their next ventures might be.)
Captain of the Guard
« Reply #1 on: November 08, 2005, 02:11:51 PM »


An interesting posting...

I think the main concepts of what you are describing here are already included in the core of this game.

Bot stopper, points of interaction, cops/moderators, explainability
all this and more is already in the core *sing* :)

There is also the possibility to report strange behaviour of all flavors with a petition to the admins.

And it should be pretty easy for an experienced modder to create a module like the Karma Module of this site to allow the users to vote for a specific user.

To be honest, I'm always skeptical when there is too much moderation somewhere. I think the LoGD community is a special one, and there is no need to overregulate the users... of course I could be wrong...
Someone with a bigger server can make a much more valuable statement than I can, but that's the way I see it.



P.S. Can you please explain to me what you meant with point 4 -> "If a player isn't posting anything, they shouldn't be able to be punished for posting something offensive"?
mr deathrain
« Reply #2 on: November 08, 2005, 03:01:12 PM »

Thanks for the reply.

Based on some of the reasons SaucyWench gave for stopping development (I think one was the amount of customer-service work related to LotGD), I assumed there wasn't a scalable way for the system to deal with issues without requiring a human to be involved. I admit that I don't know how well the current approach is scaling as more users join.

Yeah, #4 is a bit confusing.
The way of reporting abuse shouldn't allow users to free-type a username and a cause. That approach can encourage users to band together in vigilante efforts against others, even if the offending player hasn't logged into the game or posted content for days. (This seems obvious, but it's a detail worth stating.)

Heavy reporting can hide the offending content, which puts an upper limit on how much total reporting happens.
Mod God
« Reply #3 on: November 08, 2005, 03:43:15 PM »

A lot of the commentary checks are a function of the server's staff.  In the Central server, a suitably sized cadre of moderators, admins and helpers guided the players to stay within certain guidelines of conduct.  The workload was fairly reasonably distributed, so no automated or player-input type of moderation code was really necessary.

What does become difficult is the managing of server policies, modules, balance, staff (training, assisting, finding new members) and dealing with the inevitable petitions for which human intervention is an absolute must.

A module could probably be created fairly easily which would allow karma +/-, with suitable checks to prevent 'ganging up', or bullying...but it won't be done as part of the core code. (I won't be doing it either, so if someone wants to, feel free).
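A rough sketch of the kind of anti-ganging check such a karma module might use (Python, purely illustrative; none of this is core code):

```python
KARMA_COOLDOWN = 7 * 24 * 3600  # one vote per voter per target per week

def apply_karma(karma, last_vote, voter, target, delta, now):
    """karma maps target -> score; last_vote maps (voter, target) -> timestamp.
    The per-pair cooldown stops a single voter from repeatedly hammering
    the same target, which blunts the simplest form of ganging up."""
    if delta not in (-1, 1):
        return False
    key = (voter, target)
    if now - last_vote.get(key, float("-inf")) < KARMA_COOLDOWN:
        return False  # still in cooldown for this voter/target pair
    last_vote[key] = now
    karma[target] = karma.get(target, 0) + delta
    return True
```

A real module would also want rate limits across different targets and perhaps weighting by voter account age, but the per-pair cooldown is the core check.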

Incidentally, I notice you created an account but haven't logged into it for these posts. Did you receive your password?

« Reply #4 on: November 10, 2005, 02:37:28 PM »

No problem with my account... I was just being lazy. Thanks for checking though.