Wikipedia talk:Bots


Requests for the bot flag should be made on this page. This wiki uses the standard bot policy, and allows global bots and automatic approval of certain types of bots. Other bots should apply below. Global bot flag requests are handled at Meta:Steward requests.


If you are requesting bot status, please use the "Current requests" section below. For other matters dealing with bots, please use the discussion section.


Current requests

Put new requests at the top. Look here for a form to use to request bot status.


JarBot

  • Contributions
  • Operator: جار الله
  • Programming language: Python
  • Function: Create articles, categories, templates, etc.
  • Description: I would like to create stub articles based on Wikidata, and to create and add categories and templates to articles based on enwiki. It is going to be a long way, but with help from the community we will reach the goal. Example from arwiki by the bot. (See the sketch below for the general idea.)

--جار الله (talk) 17:54, 22 November 2020 (UTC)
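For context, here is a minimal sketch of the kind of Wikidata-driven stub creation being requested. This is not JarBot's actual code; the item ID, stub wording, and template name are assumptions for illustration only.

<syntaxhighlight lang="python">
# Illustrative sketch only, not JarBot's actual code.
# Assumes Pywikibot is already configured for simple.wikipedia.org.
import pywikibot

site = pywikibot.Site("simple", "wikipedia")
repo = site.data_repository()

def create_stub(item_id):
    """Create a one-sentence stub from a Wikidata item, if the page is missing."""
    item = pywikibot.ItemPage(repo, item_id)
    item.get()
    label = item.labels.get("en")
    description = item.descriptions.get("en")
    if not label or not description:
        return  # not enough data for even a minimal stub
    page = pywikibot.Page(site, label)
    if page.exists():
        return  # never overwrite an existing article
    page.text = f"'''{label}''' is {description}.\n\n{{{{stub}}}}"
    page.save(summary="Bot: creating stub from Wikidata (example run)")

# Hypothetical example item:
# create_stub("Q42")
</syntaxhighlight>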

I know you are a very trusted user elsewhere and JarBot is quite useful. However, if its pages are like Gylfi Einarsson, which you just created, I will say no: there are 12 red categories and an unused template there, which isn't helpful. One-liner pages cannot help readers much. If the pages can carry more information and be better in quality, this could surely be an idea. Note that the community typically doesn't like bot editing; see Wikipedia:Simple_talk/Archive_132#Gay_Yong_Hernandez_and_systematic_creation_of_stubs_from_IPs. Thanks for your help anyway. Camouflaged Mirage (talk) 18:11, 22 November 2020 (UTC)
Hello @Camouflaged Mirage: we can work on quality and information. As I said, the bot will create the categories and templates. I also see that the infobox is not linked to Wikidata; I will work on it if I get the approval. There is a lot of work to do, and you can give me notes about required information and I will see if I can add it to the pages. Also, the articles will not only be about humans; there will be articles about cities, and the information will be of varying quality. regards--جار الله (talk) 18:34, 22 November 2020 (UTC)
@جار الله With categories here, we typically don't create too many; we keep them to a minimum, and we only create a category when there are at least 3 pages for it. As for templates, we try to create them only when there are a lot of blue links; a totally red-linked template is quite useless. I can't decide this, it needs a crat; I am just commenting. Please also read through the links in the welcome message I left on your talk page. Best, Camouflaged Mirage (talk) 18:51, 22 November 2020 (UTC)
Denied. Creating articles by bot on this wiki has been banned by community discussion in the past. It caused us to be flooded with tiny one-sentence stubs created just from data tables, which were no help to anyone and caused a lot of work deleting many thousands of them in cleanup. -Djsasso (talk) 16:45, 23 November 2020 (UTC)
@Djsasso: Okay, what about creating and adding categories if there are three or more pages that need the category? Also adding categories to articles based on enwiki if the category exists.--جار الله (talk) 17:24, 23 November 2020 (UTC)

MajavahBot

  • Contributions
  • Operator: Majavah
  • Programming language: Pywikibot, SQL
  • Function: Create redirects from "NAME (film)" to "NAME (movie)"
  • Description: Creating these kinds of redirects would allow just changing en. to simple. in the URL when "jumping" from enwiki to here. Naleksuh previously had a script to do this, but he hasn't run it recently and the number of missing redirects has already gone over 50. I've written my own script to do exactly the same (see the sketch below for the general idea).

--Majavah (talk) 14:38, 11 October 2020 (UTC)
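A rough sketch of what such a script could look like, assuming Pywikibot; Majavah's and Naleksuh's actual scripts are not shown here and may work differently.

<syntaxhighlight lang="python">
# Rough sketch, assuming Pywikibot; not the actual MajavahBot script.
import pywikibot

site = pywikibot.Site("simple", "wikipedia")

def ensure_film_redirect(movie_title):
    """Given "NAME (movie)", create "NAME (film)" as a redirect to it if missing."""
    if not movie_title.endswith(" (movie)"):
        return
    base = movie_title[: -len(" (movie)")]
    redirect = pywikibot.Page(site, base + " (film)")
    if redirect.exists():
        return
    redirect.text = f"#REDIRECT [[{movie_title}]]"
    redirect.save(summary='Bot: redirect from "(film)" to "(movie)" title')

# Scan every article title ending in " (movie)" and fill in missing redirects.
for page in site.allpages(namespace=0, filterredir=False):
    if page.title().endswith(" (movie)"):
        ensure_film_redirect(page.title())
</syntaxhighlight>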

Greetings. I have been away from home for a while now. However, I have since returned and will create film->movie redirects now. No prejudice against discussing new owners or tasks once it is done. Naleksuh (talk) 19:44, 13 October 2020 (UTC)
I am not sure we even need a bot to do this. As was mentioned in the original discussion for this, it is something that should probably be done by hand as needed. In the first instance we allowed the script because there were so many and it just made things faster, but I am not sure we would want it to continually happen with a bot account. -Djsasso (talk) 17:44, 14 October 2020 (UTC)
Well, while Naleksuh was inactive, there were over 50 that needed doing. It is clear this is not being done manually by anyone, and they will simply never be created without some sort of script. That is worth noting. --IWI (talk) 10:57, 15 October 2020 (UTC)
To me, this is a "nice-to-have", not something we really need, so saying that they "needed doing" is overstating things. --Auntof6 (talk) 11:01, 15 October 2020 (UTC)
True. It is not exactly a very important task that needs doing. --IWI (talk) 11:03, 15 October 2020 (UTC)
Yeah, Auntof6 states what I was trying to say better. Creating these redirects is a nice thing, but not at all a necessary thing, no more so than many other tasks we have. For example, we have articles of all types that have different names than their English counterparts. The languages sidebar makes switching between the en and simple versions quick and easy. While I do often just switch the URL, there really is a solution for this already on the side of the page. -Djsasso (talk) 16:14, 15 October 2020 (UTC)
It is difficult to do by hand; on a wiki with 175k articles you would need an automated process to find which titles to create. And at that point, you might as well create them automatically as well. That said, what we currently have is semi-automated, and I don't see a clear need to make it fully automated. Naleksuh (talk) 01:12, 18 October 2020 (UTC)

InternetArchiveBot

  • Contributions
  • Operator: Cyberpower678
  • Programming language: PHP
  • Function: Fix dead links, blue book refs
  • Description: This bot will scan Wikipedia looking for dead links. Dead links will be supplemented, or replaced, with archive URLs. With a feature currently in the works, the bot will also look for unlinked book references and add a link to a preview of the book, or of the page if one is mentioned in the reference. (A sketch of the dead-link idea is shown below.)

CYBERPOWER (Chat) 19:54, 11 September 2020 (UTC)
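For readers unfamiliar with the bot, the core dead-link workflow looks roughly like the sketch below. This is not InternetArchiveBot's real code (which is a large PHP project); it only illustrates the public Wayback Machine availability API that such a workflow can use.

<syntaxhighlight lang="python">
# Illustrative sketch only; InternetArchiveBot's real (PHP) code is far more involved.
import requests

def find_archive_url(dead_url):
    """Ask the Wayback Machine availability API for the closest snapshot of a URL."""
    resp = requests.get(
        "https://archive.org/wayback/available",
        params={"url": dead_url},
        timeout=30,
    )
    resp.raise_for_status()
    snapshot = resp.json().get("archived_snapshots", {}).get("closest")
    if snapshot and snapshot.get("available"):
        return snapshot["url"]
    return None  # no archived copy found; the reference would be tagged instead

# A dead reference would then be supplemented or replaced with the returned URL:
# print(find_archive_url("http://example.com/old-page"))
</syntaxhighlight>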

Approved for trial (50 edits). --Chenzw  Talk  16:50, 12 September 2020 (UTC)
Chenzw, The bot is running into captcha requests.—CYBERPOWER (Chat) 19:45, 12 September 2020 (UTC)
Trial complete.—CYBERPOWER (Chat) 00:50, 13 September 2020 (UTC)
Approved. Apologies, it completely slipped my mind that the edits were going to trigger the captcha. Flag granted. Chenzw  Talk  11:44, 13 September 2020 (UTC)

DaedanBot

— Gomdoli (talk) 01:03, 31 August 2020 (UTC)

Is this "one-time" or continuous, automatic, constant check and retarget for any category that has been moved? Naleksuh (talk) 01:19, 31 August 2020 (UTC)
It's not automatic, but I'm planning to run it when there's a new redirect category. — Gomdoli (talk) 01:29, 31 August 2020 (UTC)
In that case you may be looking for WT:AWB/CP. For the African-American pages, though, I would say there are few enough that it could be done manually. Naleksuh (talk) 01:31, 31 August 2020 (UTC)
Thank you for explaining. I will use this bot account. — Gomdoli (talk) 02:53, 31 August 2020 (UTC)
There is already a bot that does this automatically without needing to be run manually. If you are talking about a specific category that isn't cleared yet, that is because new category moves have a cool-down period: the bot waits 7 days after a move in case the move is reverted, which is the case with the African Americans pages. This is to prevent unnecessary edits moving the pages back and forth between categories. -Djsasso (talk) 11:05, 31 August 2020 (UTC)
You can see its log of work here. I am inclined to decline this request as not being necessary, and because we generally don't hand out AWB just to be run on an ad-hoc basis on this wiki. -Djsasso (talk) 11:20, 31 August 2020 (UTC)
This is not a one-time plan. It's not automatic, but I will try to run it steadily. — Gomdoli (talk) 23:47, 31 August 2020 (UTC)
Yes, I am aware; my point was that we have a bot that already does it automatically (it runs every day, so articles never sit in a redirected category for more than a day, unless it's a recently moved category, where they wait 7 days), so we don't need someone who does it manually. -Djsasso (talk) 11:16, 1 September 2020 (UTC)
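For reference, the behaviour described above amounts to something like the following sketch. It is not the actual bot's code; the edit summary, the way the cool-down is measured, and the category handling are assumptions.

<syntaxhighlight lang="python">
# Hedged sketch of the redirected-category cleanup described above; not the real bot.
from datetime import timedelta
import pywikibot

site = pywikibot.Site("simple", "wikipedia")
COOLDOWN = timedelta(days=7)  # skip categories redirected within the last week

def retarget_members(cat_title):
    cat = pywikibot.Category(site, cat_title)
    if not cat.isCategoryRedirect():
        return
    # Cool-down: if the redirect is recent, the category move might still be reverted.
    if site.server_time() - cat.latest_revision.timestamp < COOLDOWN:
        return
    target = cat.getCategoryRedirectTarget()
    for page in cat.members():
        page.change_category(cat, target,
                             summary="Bot: moving page out of redirected category")
</syntaxhighlight>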
@Auntof6: — Gomdoli (talk) 00:22, 1 September 2020 (UTC)

Not done. I see no benefit in duplicating a process that's already done by an automated bot. I'm also reluctant to give bot permission to a user who has been editing on this wiki for only a little more than two weeks. --Auntof6 (talk) 00:41, 1 September 2020 (UTC)

I'll request when I need it next time. — Gomdoli (talk) 00:45, 1 September 2020 (UTC)
@Auntof6: Is it not the case that only bureaucrats can review these requests, or does that only apply to approvals? --IWI (talk) 00:50, 1 September 2020 (UTC)
Good question. I forgot that it takes a crat to grant that right. I'll strike my response and let a crat respond. However, I'll leave the account blocked. --Auntof6 (talk) 03:36, 1 September 2020 (UTC)
Yes, it can only be a crat. I guess I should make my earlier comment clearer with a template: Not done. -Djsasso (talk) 11:16, 1 September 2020 (UTC)

ChenzwBot

  • Contributions
  • Operator: Chenzw
  • Programming language: Python (custom library)
  • Function: Maintenance of Wikipedia:Good articles/by date
  • Description: This proposed new task involves automatic population/removal of entries from Wikipedia:Good articles/by date - the current manual workflow involves very tedious renumbering of items in the table, with the process becoming even more painful if the oldest GAs are being demoted. I don't anticipate needing a bot flag for this, and the bot will mark such edits as non-bot explicitly. The code hasn't been written yet, but this is not expected to take long because the MediaWiki API functionality is already in a library that was created for the bot's anti-vandalism work (so that's half the code already done), and requesting for "in-principle approval" first before writing of the remaining bot code starts, just in case there is a serious objection for this particular bot task.

--Chenzw  Talk  16:40, 29 August 2020 (UTC)

When does this bot decide to edit the table? Does it watch the status of GA tags themselves, or is it semi-automated/operator input? Naleksuh (talk) 23:36, 29 August 2020 (UTC)
The current intention is to run the task on a schedule. The task will check the category members and compare them against the GA list. Chenzw  Talk  03:22, 31 August 2020 (UTC)
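In code, that scheduled comparison might look roughly like the sketch below. The category name and the assumption that the list page links every listed article are illustrative, not Chenzw's; the real task (including the table renumbering) is more involved.

<syntaxhighlight lang="python">
# Rough sketch of the scheduled comparison described above; not the final bot code.
import pywikibot

site = pywikibot.Site("simple", "wikipedia")

def tagged_good_articles():
    """Titles currently in the good-articles category (category name assumed)."""
    cat = pywikibot.Category(site, "Category:Good articles")
    return {page.title() for page in cat.articles()}

def listed_good_articles():
    """Titles already on the by-date list page (assumed to be plain wikilinks)."""
    list_page = pywikibot.Page(site, "Wikipedia:Good articles/by date")
    return {link.title() for link in list_page.linkedPages(namespaces=[0])}

tagged = tagged_good_articles()
listed = listed_good_articles()
to_add = tagged - listed      # newly promoted GAs missing from the list
to_remove = listed - tagged   # demoted GAs still listed
# The actual edit (adding/removing rows and renumbering the table) would follow here.
</syntaxhighlight>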
Yep go for it. -Djsasso (talk) 11:08, 31 August 2020 (UTC)

Brantmeierz-bot

  • Contributions
  • Operator: Brantmeierz
  • Programming language: Java (using a slightly modified version of https://github.com/MER-C/wiki-java)
  • Function: Redirect management
  • Description: I've been spending a lot of time manually creating and categorizing redirects, many of which follow rules that could be easily identified and applied automatically. In addition, something that I haven't been adding while manually working on redirects is Template:Rcat shell, which helps a lot with the clarity of the pages and with grouping those that have multiple redirect reasons. My bot would perform these behind-the-scenes tasks only on redirect pages.

--Brantmeierz (talk) 03:03, 22 January 2020 (UTC)

  • You will need to list out exactly what edits the bot will be making. Now I understand the Rcat shell one but you imply there are others. What are they? -DJSasso (talk) 10:43, 22 January 2020 (UTC)
Some I had in mind to apply, and their logic, were:
  • Template:R from other capitalization; guaranteed to identify (if the source and destination page titles are different but equal when both are forced to lowercase, they must differ only in capitalization)
  • Template:R to section; guaranteed to identify (check whether the destination link points to a section, i.e. contains #)
  • Template:R from specific geographic name; fairly easy to identify (look for redirects where destination page is a location and the page title is {Destination}, {Other identifiers})
  • Template:R from acronym; wouldn't identify all of them, but a conservative method would be finding redirect pages where the destination page consists only of words beginning with each consecutive letter of the source page's name
Others, like R from other spelling, plurals, etc., would definitely be possible to identify using a dictionary file or APIs, but since that has a much greater chance of producing erroneous results, it's not something I would attempt with a bot yet. (A rough sketch of the simpler checks is below.) Brantmeierz (talk) 16:02, 22 January 2020 (UTC)
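A rough sketch of how the simpler checks above could be implemented (string logic only; the template names come from the list above, everything else is illustrative and not Brantmeierz's actual bot code):

<syntaxhighlight lang="python">
# Illustrative checks for classifying a redirect from its titles; not the bot's code.
def rcat_templates(source_title, target):
    """Return redirect-category templates that clearly apply, based on titles alone."""
    templates = []
    target_title = target.split("#", 1)[0]
    # R from other capitalization: titles differ, but match once both are lowercased.
    if source_title != target_title and source_title.lower() == target_title.lower():
        templates.append("{{R from other capitalization}}")
    # R to section: the redirect target contains a section anchor.
    if "#" in target:
        templates.append("{{R to section}}")
    # R from acronym (conservative): every letter starts a consecutive word of the target.
    words = target_title.split()
    if (source_title.isupper() and len(source_title) == len(words)
            and all(word[:1].upper() == letter
                    for word, letter in zip(words, source_title))):
        templates.append("{{R from acronym}}")
    return templates

# Example: rcat_templates("WHO", "World Health Organization")
# -> ["{{R from acronym}}"]
</syntaxhighlight>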
Personally I don't think redirects should be categorized on simple at all, regardless of whether it's by a bot or a human; it just doesn't seem the simple way to me. If it is done, though, I guess it won't really matter which route is taken. Computer Fizz (talk) 08:47, 23 January 2020 (UTC)
Since it's mostly behind the scenes (more like maintenance categories, not related to page content), I don't see why the practices should differ much between enwiki and simple (except for some regrouping/renaming; the categorization system here is a lot less technical than enwiki's). For the parts where it does affect readers (such as if they happen to stumble across the redirect pages themselves, which will definitely happen sometimes), it helps with readability and navigation to have what's going on explained, instead of them only being presented with the technical wiki markup for a redirect. Brantmeierz (talk) 16:14, 23 January 2020 (UTC)
@Computer Fizz: From a management side, ignoring the user perspective, it’s nice to have the pages bound to something, since in a lot of cases the only links to them will be found on the “What links here” section of any given page. Having them categorized with similar pages helps make them more searchable than if they were floating around without any page links, and is especially helpful for restructuring or renaming things since they can be identified as a batch with similar properties. Brantmeierz (talk) 16:18, 23 January 2020 (UTC)
Yeah our category system is intended to be a lot more simple than the one on en.wiki. Having a complex category system can end up being unsimple. I do understand where you are coming from so you aren't out in left field or anything. Personally I don't care one way or the other if we categorize redirects. I just know I am not likely to do it myself because I doubt anyone would ever really look at the categories of redirects. -DJSasso (talk) 16:33, 23 January 2020 (UTC)
  • Approved for trial (100 edits). Ok I will approve a trial for only the 4 tasks above and Rcat shell. I would like to see 100 edits with some edits of each kind. Should you want to add other tasks in the future you would need to come back here for additional approval. -DJSasso (talk) 16:33, 23 January 2020 (UTC)
Isn't there an abuse filter called "redirect page with extra text"? If we're categorizing them now, should that be deleted? Computer Fizz (talk) 17:07, 23 January 2020 (UTC)
Filter 36, but it can easily be changed to include the bot group as well. rollingbarrels (talk) 17:10, 23 January 2020 (UTC)
We have been categorizing them for a long time. Just no one has wanted to do it to the scale he is suggesting. It isn't new. But yes, I will likely add the bot group. (although probably not needed because the bot would be autoconfirmed) -DJSasso (talk) 17:11, 23 January 2020 (UTC)
Abuse filter 36 was designed to tag edits where an unsuspecting user adds content to a redirect, without realising that it would not be visible. If you were to look at the filter rules you will see that all autoconfirmed users will not trip the filter. And as for non-autoconfirmed users causing a false positive if they attempt to categorise redirects, I think the chance of that happening is very low, and it is far more likely that non-autoconfirmed users are (mistakenly) trying to add content in a redirect page. Chenzw  Talk  13:36, 24 January 2020 (UTC)
Yep that is why I said not needed because it would be autoconfirmed. And you are right very unlikely a non-autoconfirmed user would be categorizing articles legitimately. And it only tags an article, doesn't prevent it anyway. -DJSasso (talk) 13:41, 24 January 2020 (UTC)

User:DYKUpdater

  • Contributions
  • Operator: FNAFPUPPETMASTER
  • Programming language: Pywikibot
  • Function: Simplifying / automating the DYK panels on the main page, archiving recent ones, and clearing the replaced ones.
  • Description: This bot would make the process of updating the DYK section simpler by running every week, when it needs to be replaced. — Preceding unsigned comment added by FNAFPUPPETMASTER (talkcontribs)
Two issues here: 1) have you already written the bot code for this task? 2) there is currently no editor activity on DYK. Until activity picks up, a bot is effectively useless. Chenzw  Talk  07:17, 27 February 2020 (UTC)
Not only that, but the DYK system is already automatic on its own, with a queue system. -DJSasso (talk) 11:38, 27 February 2020 (UTC)
Then perhaps this could just be moved into a user script? rollingbarrels (talk) 15:43, 27 February 2020 (UTC)

Policy Discussion

Request a new bot

Bot for blocking open proxies (APIs and examples can be provided, but I am not allowed to run such a bot, as that would require the admin flag). Naleksuh (talk) 03:25, 24 July 2020 (UTC)

This would need to be evaluated very carefully, and would require broader community approval due to automated admin actions. How would this bot work, and what is the codebase? Chenzw  Talk  10:21, 24 July 2020 (UTC)
Finding IPs that edit, from EventSource, and running them through the same checks the Toolforge tool uses (i.e. https://ipcheck.toolforge.org/index.php?ip=185.220.101.27 ). There currently isn't a "codebase", as 1) I am not allowed to operate bots that block without RfA'ing, and 2) this is "request" a new bot, not "create" a new bot. That said, I do have experience with similar bots in the past (Special:PermanentLink/7044544#Second_Flood_Request), and if this task is approved it can be created by me (likely in Node). Naleksuh (talk) 19:58, 24 July 2020 (UTC)
So what you'd like is a new bot function rather than a whole new bot? If an existing bot could perform the function, would that suffice? --Auntof6 (talk) 22:27, 24 July 2020 (UTC)
@Auntof6: I'm not really sure what the difference would be, as no bots currently do anything relating to proxies. If you mean, does there need to be a whole new account, not necessarily; lots of our bots do multiple tasks.
I have been asked by multiple people for code so far (Chenzw asked what the "codebase" was right above, oper asked me on IRC to write code for this bot that he can review). I have been under the impression that I would not be allowed to run a bot that blocks, but since multiple admins have asked me to create it, that may not be the case. Let me know. I am willing to help any admins interested in creating such a bot (or "bot function", as Auntof6 says). Naleksuh (talk) 23:28, 24 July 2020 (UTC)
I am not sure what the confusion is. For the benefit of the wider community, in my previous conversation with Naleksuh on Discord, I stated that since examples could be provided (from above), the community would probably want to see actual bot code before approving this particular use case for a bot. Chenzw  Talk  02:10, 25 July 2020 (UTC)
Here is the description that I gave to @Chenzw: off-wiki: "essentially, whenever an IP edits, it is checked through several proxy APIs, and if it is determined to be a proxy, it is blocked". Now that we are on-wiki, some questions should be answered:
1. How long will the block be? (My opinion: 1 month; this is the standard length, and if the IP still reports as a proxy at the end, the bot can just block it for another month)
2. Should it revert the IP's edits, or just block? (My opinion: just block; the use of a proxy alone does not guarantee an LTA (other, behavioral evidence that a bot can't find is required), and reverting doesn't require an admin, only blocking does)
3. Should it post a block notice on the IP's talk page? (My opinion: no, this is mainly intended for an LTA who frequently visits old talk pages)
Other than this, decisions are probably left up to the person operating the bot. A quick example can be found at https://gist.github.com/Naleksuh/914ab4ac3b0277e44c29c862383974e5 , although this only includes one service and does not block the associated proxies (that is in progress). Hope this clears things up! Naleksuh (talk) 03:48, 25 July 2020 (UTC)
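To make the proposed workflow concrete, here is a hedged sketch of the relaying part. Naleksuh's gist covers one service and is written in JavaScript; this Python version is illustrative only, and the proxy-check step is a placeholder because the exact detection services and their APIs are not specified here.

<syntaxhighlight lang="python">
# Hedged sketch of the proposed workflow; not Naleksuh's gist. The proxy-check
# step is a placeholder: which services are queried, and how, is an assumption.
import json
import re

from sseclient import SSEClient  # pip install sseclient

STREAM = "https://stream.wikimedia.org/v2/stream/recentchange"
IPV4 = re.compile(r"^\d{1,3}(\.\d{1,3}){3}$")

def looks_like_open_proxy(ip):
    """Placeholder: query one or more proxy-detection services and combine results."""
    return False  # TODO: real checks (e.g. the services used by the ipcheck tool)

for event in SSEClient(STREAM):
    if not event.data:
        continue  # skip keep-alive events with no payload
    change = json.loads(event.data)
    user = change.get("user", "")
    if change.get("wiki") != "simplewiki" or not IPV4.match(user):
        continue  # only anonymous IPv4 edits on this wiki (IPv6 omitted for brevity)
    if looks_like_open_proxy(user):
        # An admin bot would place the block here, e.g. for one month per the proposal.
        print(f"Would block {user} as an open proxy")
</syntaxhighlight>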
If we were going to do this (and I am not entirely sure we should get into having admin bots on simple), we should just see if we can't get a copy of the one operating on en.wiki to run here, if that botop is comfortable giving it to one of the admins here who are experienced in running bots. (They may not be, because of the special sauce used to find said proxies and wanting to keep it out of the wrong hands.) -Djsasso (talk) 22:45, 28 July 2020 (UTC)
@Djsasso: Would you be referring to User:ProcseeBot or User:ST47ProxyBot? Also, neither of these bots uses rc-relayed information, and they both miss tons of proxies (I would say about 90%+ of the editing proxies), so using them here would be very ineffective. Naleksuh (talk) 06:13, 31 July 2020 (UTC)

Important: maintenance operation on September 1st

User:Trizek (WMF) (talk) 10:30, 31 August 2020 (UTC)

I would like to create a new bot

Hello, I'm Astronomyscientist124 and I am here to request a new bot on the Simple English Wikipedia. I would like it to be called AstronomyBot, and its tasks are to block vandals, create articles on science, and make users administrators. I would like to operate the bot if that is okay, thanks. Astronomyscientist124 (talk) 05:09, 12 October 2020 (UTC)

I am not a crat, but none of those are tasks we would want a bot to do. Blocking users and making users admins requires admin and crat rights, respectively, and those rights are not given to bots. We also don't want bots to create articles. --Auntof6 (talk) 05:56, 12 October 2020 (UTC)

Important: maintenance operation on October 27

Please help translate to your language. Thank you.

This is a reminder of a message already sent to your wiki.

On Tuesday, October 27 2020, all wikis will be in read-only mode for a short period of time.

You will not be able to edit for up to an hour on Tuesday, October 27. The test will start at 14:00 UTC (14:00 WET, 15:00 CET, 10:00 EDT, 19:30 IST, 07:00 PDT, 23:00 JST, and in New Zealand at 03:00 NZDT on Wednesday October 28).

Background jobs will be slower and some may be dropped. This may have an impact on some bots' work.

Know more about this operation.

-- User:Trizek (WMF) (talk) 09:25, 26 October 2020 (UTC)