User Guide
Topic
Overview

Topics are the central elements of nlite. To investigate a controversial question, begin by creating a new topic using the New Topic button in the top navigation bar.

Creation

When creating a topic, you will be asked for the following information:

Title

A brief, descriptive title limited to 120 characters. (Check out this tip on how to select effective topics.)

Description

Use the description to clarify the question and provide any necessary context. The maximum length of this section is 500 characters.

Argument Submission Deadline

The cutoff point for users to submit their arguments. The goal of this deadline is to ensure all good arguments are put forward for evaluation.

Argument Evaluation Deadline

After the Argument Submission Deadline has passed, users can continue to evaluate existing arguments until a second deadline, the Argument Evaluation Deadline. The gap between the two deadlines is important: it allows every submitted argument to be evaluated, including those submitted shortly before the first deadline.

Once the Argument Evaluation Deadline passes, the topic is closed, and the topic page will display the final list of top arguments identified for all the viewpoints.

Allowed Number of Viewpoints

The topic creator controls the maximum number of viewpoints that can be submitted on the topic page. This value can be set to 2, 3, or 4 and can be changed later by editing the topic.

Hide argument authors while evaluations are in progress

If this checkbox is checked, the platform hides the authors of arguments and counters while evaluations are in progress, regardless of the authors' own anonymity settings. This feature helps users focus on the content of arguments rather than on who submitted them.

Notably, after the Argument Evaluation Deadline has passed, this setting loses its effect, and the visibility of the authors of arguments and counters depends solely on the anonymity choice made by the authors themselves. See the Anonymity section for more information.

Anonymous

If this checkbox is checked, the identity of the topic creator will not be disclosed. Of note, the anonymity status can be changed at any time.

Editing and Deleting

The user who creates a topic can always edit its fields or delete it completely. To do this, they need to click on the three dots next to the topic title and select Edit or Delete.

Viewpoint
Overview

Viewpoints represent different perspectives on topics. They can be added using the Submit Viewpoint button located below the topic description.

Submission

When submitting a viewpoint, you will be prompted for the following information:

Viewpoint

A short phrase describing a perspective on the topic. Its maximum length is 120 characters.

Anonymous

This is a checkbox. If checked, the platform does not disclose the identity of the individual who submits the viewpoint. The anonymity status can be changed at any time.

Editing and Deleting

The user who submits a viewpoint can edit its fields or delete it completely. To do this, they need to click on the three dots next to the viewpoint and select Edit or Delete.

Argument and Counter
Overview

Arguments are short passages submitted to support a viewpoint. Counters, on the other hand, aim to debunk a submitted argument. As the platform's name suggests, its primary goal is to identify the top arguments for various viewpoints on controversial topics. It also aims to identify the top counters for each submitted argument.

Topic pages, by default, display the top three arguments identified for each viewpoint. More arguments can be seen by clicking Load More at the bottom of the list, and the full list of arguments submitted for a viewpoint is available on the viewpoint's dedicated page. Arguments are always sorted in descending order of strength, based on the evaluations submitted to the platform so far.

Topic pages also show the top counter identified for each displayed argument. The full list of counters submitted for an argument is available on the argument's dedicated page.

Users are free to submit arguments under any viewpoint they wish. However, in practice, they are more likely to submit arguments under the viewpoints they endorse. Likewise, users are more likely to submit counters for the arguments submitted under the viewpoints they disagree with.

Display

Arguments are displayed as yellow boxes below the respective viewpoint. The color yellow symbolizes that arguments aim to enlighten society. Counters, on the other hand, are shown as green boxes under the argument they address. The color green for counters is chosen intentionally and represents a good-faith and friendly attempt to highlight a point that the argument submitter might have overlooked. Maintaining a positive and friendly environment is one of the key priorities of the platform.

Submission

Users are prompted for the following information when submitting an argument or counter:

Title

The title should be carefully worded to communicate the gist of the argument. Note that the topic page, by default, shows only the titles of arguments and counters; the descriptions appear only after clicking the small arrows next to the titles. A descriptive title therefore helps users quickly grasp what the argument is about. The maximum length of the title is 120 characters.

Description

The description section should include the details of the argument or counter and any possible references needed to support the claims made. We encourage users to spend quality time crafting logical, data-backed, and articulate arguments. This will help inform society more effectively and also increase the chances that the argument makes it among the top selected arguments. The description of an argument or counter can be at most 750 characters.

Source Type

The Source Type aims to clarify the type of information sources used in the argument or counter. There are two Source Types to choose from when submitting an argument or counter: Self-explanatory and Linked References. The self-explanatory type represents arguments that are supported by the principles of logic and do not need external references. In contrast, when selecting the source type Linked References, the argument submitter acknowledges that (i) certain parts of the argument require external references, and that (ii) they are linking those references to the submission. That's where the name Linked References comes from.

We recommend providing necessary external references as hyperlinks within the description of an argument or counter.

Anonymous

This is a checkbox. If checked, the platform does not disclose the identity of the individual who submits the argument or counter. Of note, the anonymity status can be changed at any time. For example, users may initially submit their arguments anonymously and disclose their identity later if those arguments make it among the top selected arguments.

Editing and Deleting

The author of an argument or counter can edit its fields or delete it completely. To do this, they need to go to the dedicated page for the argument or counter, click on the three dots next to its title, and select Edit or Delete.

Ranking Algorithm
Introduction

A commonly used method for ranking content on online platforms is through up-vote and down-vote buttons. However, this approach has an important flaw: it can bias the ranking toward early submissions, which get more visibility and thus more votes. To avoid this issue, nlite adopts a completely different approach based on pairwise comparisons of randomly selected arguments.

How It Works

There is an Evaluate Arguments button beneath each viewpoint. Whenever a user clicks this button, the platform selects two arguments at random from the pool of arguments submitted for that viewpoint, presents them to the user, and asks which of the two the user considers stronger. These pairwise comparisons are aggregated in real time to identify the top arguments for each viewpoint.
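As a rough sketch of this mechanism (the names and data structures below are illustrative, not the platform's actual code), the selection and aggregation steps could look like this in Python:

import random
from collections import defaultdict

# Illustrative in-memory tallies: argument id -> counts.
wins = defaultdict(int)         # pairwise comparisons won
appearances = defaultdict(int)  # pairwise comparisons participated in

def pick_pair(argument_ids):
    """Select two distinct arguments uniformly at random for one evaluation."""
    return random.sample(argument_ids, 2)

def record_evaluation(winner_id, loser_id):
    """Aggregate a single pairwise comparison in real time."""
    wins[winner_id] += 1
    appearances[winner_id] += 1
    appearances[loser_id] += 1

# Example: a user clicks Evaluate Arguments for a viewpoint with four arguments.
arguments = ["arg-1", "arg-2", "arg-3", "arg-4"]
first, second = pick_pair(arguments)
record_evaluation(winner_id=first, loser_id=second)  # the user judged `first` stronger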

The platform ranks arguments independently for each viewpoint, ultimately identifying the top arguments for all sides. Importantly, the popularity of a viewpoint does not play a significant role; what matters is the strength of its supporting arguments.

The platform also aims to rank the counterarguments submitted for each argument. The process is very similar and occurs through the Evaluate Counters buttons located beneath each argument.

When evaluating arguments or counterarguments, users should consider factors such as the accuracy of the data, logical consistency, and the clarity of expression.

Selection Mechanism and Ranking Speed

The algorithm currently used by the platform is quite simple: in each iteration, it picks two arguments uniformly at random, without keeping a history of previous selections. Despite its simplicity, the algorithm is efficient at identifying the top arguments. In particular, it can be shown that the number of comparisons needed to identify the top argument in a list of \( n \) arguments is of the order \( n\log(n) \). For technical details, please refer to the section titled Simple does it: eliciting the Borda rule with naive sampling in this paper by Lee et al.

Note that \( \log(n) \) grows very slowly for practical values of \( n \), which are expected not to exceed a few thousand. In practice, therefore, the top arguments can be accurately identified as long as the number of pairwise comparisons grows roughly in proportion to the number of submitted arguments.
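For example, ignoring constant factors and using the natural logarithm, a list of \( n = 1000 \) arguments would need on the order of \( 1000 \times \log(1000) \approx 6900 \) pairwise comparisons, i.e., roughly seven comparisons per submitted argument.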

Argument and Counter Scores

The ranking algorithm calculates a quality score as a percentage between 0 and 100 for each argument and counter. These scores are continuously updated as new evaluations are submitted, and they are used to rank the submitted arguments and counters. In the current implementation, an argument or counter's score is simply defined as the fraction of pairwise comparisons it has won.

To prevent evaluators from being influenced by the assessments of others, the scores of arguments and counters are not disclosed while evaluations are in progress, i.e., before the Argument Evaluation Deadline. This behavior cannot be changed (unlike the visibility of argument and counter authors, which topic creators can control).

When an argument or counter is newly submitted, its score may not be reliable, a factor to consider when ranking. The current criterion for reliability is whether the argument or counter has appeared in at least five pairwise comparisons. The platform adds the label "New" next to the scores of arguments or counters that have not yet met this threshold; for example, Score: 85% (New).

To account for the unreliability of scores before the required threshold is met, the platform scales scores by a factor of \( \min(k, 5)/5 \), where \( k \) is the number of pairwise comparisons the argument or counter has participated in. Note that this factor equals 1, and thus has no effect, once the threshold of five comparisons is met.
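As a minimal illustration of the score described above (a sketch under the percentage-based definition, not the platform's actual implementation):

def score(wins: int, comparisons: int) -> float:
    """Quality score as a percentage between 0 and 100.

    `comparisons` is the number of pairwise comparisons the argument or
    counter has appeared in; `wins` is how many of those it won. Scores are
    scaled by min(comparisons, 5) / 5 until the reliability threshold is met.
    """
    if comparisons == 0:
        return 0.0
    win_fraction = wins / comparisons
    reliability = min(comparisons, 5) / 5  # equals 1 once five comparisons are reached
    return 100 * win_fraction * reliability

# Example: 3 wins out of 4 comparisons -> 75% scaled by 4/5 = 60.0, labeled "New".
print(score(wins=3, comparisons=4))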

Redundancy Evaluation
Introduction

It is conceivable that a good argument may be submitted in various formats by different users. If the ranking algorithm functions correctly, all of these instances will be elevated to the top of the list, resulting in redundancy among the selected top arguments. To avoid this issue, the platform needs a mechanism to identify and remove duplicate arguments.

How It Works

Similar to ranking arguments, the platform primarily relies on its audience (i.e., the wisdom of the crowd) to detect redundancy among arguments. In particular, when the Evaluate Arguments button below a viewpoint is clicked, the platform, at times, replaces the standard question with: Are the following arguments (essentially) making the same point?

If the user responds positively to this question, a follow-up question appears, asking for feedback on which of the two presented arguments should be kept and which should be removed.

If enough users vote that a given argument X is covered by another argument Y, argument X will eventually be removed and will no longer be included in the ranking process.

Selection Mechanism

In the current implementation, the redundancy check is performed only on the top five arguments. The idea is that unless an argument is deemed one of the strongest, we should not spend time checking its overlap with others.

Among the top five positions, higher-placed arguments have a higher chance of being selected for evaluation. The weight used to select the argument at position \( i \), \( 1 \leq i \leq 5 \), is heuristically set to \( \text{round}(1.5^{\,6 - i}) \), which yields the weights \( [8, 5, 3, 2, 2] \).

The two arguments are selected sequentially. First, an argument is chosen based on the five weights above. Then, it is dropped along with its weight, and the second argument is selected based on the remaining four weights.
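The following sketch illustrates this two-step weighted selection (it assumes the arguments are passed in ranked order; the function name is ours, not the platform's):

import random

def redundancy_check_pair(top_arguments):
    """Pick two of the top five arguments for a redundancy check.

    Higher-ranked arguments are more likely to be chosen. The weight for
    position i (1-based) is round(1.5 ** (6 - i)), i.e. [8, 5, 3, 2, 2].
    """
    candidates = top_arguments[:5]
    weights = [round(1.5 ** (6 - i)) for i in range(1, len(candidates) + 1)]

    # First pick, weighted by position.
    first = random.choices(candidates, weights=weights, k=1)[0]

    # Drop the first pick and its weight, then pick the second from the rest.
    idx = candidates.index(first)
    remaining = candidates[:idx] + candidates[idx + 1:]
    remaining_weights = weights[:idx] + weights[idx + 1:]
    second = random.choices(remaining, weights=remaining_weights, k=1)[0]

    return first, second

print(redundancy_check_pair(["arg-A", "arg-B", "arg-C", "arg-D", "arg-E"]))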

Anonymity
Controlled by Authors

Authors always have full control over the anonymity of their submissions, including topics, viewpoints, arguments, counters, argument evaluations, and counter evaluations. This control is exercised through a checkbox displayed when submitting or editing an element.

Of note, authors can change the anonymity status of their submissions at any time. For example, a user can submit an argument anonymously and later decide to disclose their identity if the argument ends up among the top selected arguments for the targeted viewpoint.

Controlled by Topic Owners

The user who creates a topic has the option to hide the authors of arguments and counters while evaluations are in progress, regardless of the authors' own anonymity settings. This feature aims to direct users' focus toward the content of submissions rather than who submitted them. To enable it, the topic creator needs to check the checkbox titled Hide argument authors while evaluations are in progress when creating the topic.

Of note, once the evaluation period is over (i.e., when the Argument Evaluation Deadline passes), this option loses its effect, and the anonymity of arguments and counters will only depend on the anonymity status chosen by their authors.

Life Stories and Comments

Each argument and counter page includes a Life Stories section and a Comments section. The Life Stories section provides a space for users to share their related personal experiences. The Comments section provides a space for users to share all their other thoughts.

These two sections are crucial parts of the platform, given the well-known impact of emotions on human decision-making. They expose users to the real-world impacts of the submitted arguments, as reported by firsthand witnesses.

Post-surveys

After a user has interacted with the platform's content, it would be useful to assess the extent to which their opinions may have changed. To achieve this, a quick survey is provided after the topic is closed. The survey is accessible via a link below the topic description on the topic page.

The survey includes a qualitative question and a quantitative one. The qualitative question is a multiple-choice question that asks to what extent the user's view has shifted. The response options are Not at all, Just a little bit, Considerably, and Total transformation.

The quantitative question delves deeper into the extent to which the user's opinion may have changed. It asks for the user's degree of support for each of the listed viewpoints, expressed as a percentage between 0% and 100%, both before and after reviewing the topic's content.

For example, imagine a topic with two viewpoints, Viewpoint 1 and Viewpoint 2. A user might initially lean 80% towards Viewpoint 1 and 20% towards Viewpoint 2. However, after exploring the platform, their support might shift to 40% for Viewpoint 1 and 60% for Viewpoint 2.
