Wikipedia talk:Bots


Requests for the bot flag should be made on this page. This wiki uses the standard bot policy, and allows global bots and automatic approval of certain types of bots. Other bots should apply below. Global bot flag requests are handled at Meta:Steward requests.


If you are requesting bot status, please use the Current requests section. For other matters dealing with bots, please use the "Policy Discussion" section below.


Current requests[change source]

Put new requests at the top. Look here for a form to use to request bot status.

ChenzwBot[change source]

As a result of changes in the wikitext parser, links prefixed with more than one colon are now invalid, and will appear as plain text instead. Special:LintErrors estimates 30,000+ instances of such issues. This new task will have the bot run in AWB's automatic mode to correct instances of [[:: to [[:. I corrected a few such issues while flagging myself as a bot, but there are simply too many pages (about 8000+, based on the database dump dated 20 April) to deal with. This should be done as a bot (and not as a flooder) because the majority of these issues are on user talk pages, and only the bot user group has the nominornewtalk permission (which suppresses new message notifications when making a minor edit). --Chenzw  Talk  05:36, 5 May 2018 (UTC)
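
For reference, the substitution itself is a single regular expression. A minimal Python/pywikibot sketch follows; the page list here is an illustrative assumption (the actual run used AWB's automatic mode, working from Special:LintErrors and the database dump):

```python
import re
import pywikibot

# Collapse two or more leading colons in a wikilink down to one,
# e.g. [[::Foo]] -> [[:Foo]].
DOUBLE_COLON = re.compile(r"\[\[:{2,}")

site = pywikibot.Site("simple", "wikipedia")
for title in ["User talk:Example"]:  # hypothetical page list
    page = pywikibot.Page(site, title)
    fixed = DOUBLE_COLON.sub("[[:", page.text)
    if fixed != page.text:
        page.text = fixed
        # Minor bot edit, so nominornewtalk suppresses talk page notifications.
        page.save(summary="Fix links with multiple leading colons",
                  minor=True, botflag=True)
```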

Reasonable, go ahead and fix them. -DJSasso (talk) 10:43, 7 May 2018 (UTC)
Task complete, remaining issues are related to {{softredirect}}, which requires a modification of the template. Chenzw  Talk  11:51, 22 May 2018 (UTC)

PSLBot[change source]

  • Contributions
  • Operator: Psl631 (talk|contributions)
  • Programming language: Lua
  • Function: Reverting vandalism, reporting vandals, removing spam links
  • Description: Using this to revert vandalism, add user warnings to a vandal/spammer's talk page, and report the user if they have already received a final warning.

Psl631 10:38, 17 March 2018 (UTC)

  • @Psl631: What library is the bot using, and how does the bot intend to detect vandalism? Chenzw  Talk  08:44, 1 May 2018 (UTC)
  • @Chenzw: I do not know which "library" is for bots; what is it? The bot can detect vandalism when it sees a word like "butt", "ass", or "your mom", and it will revert the edit. Then the bot can add a warning to the user's talk page, and after a final warning it can block the user from editing or report them to the WP:VIP/B page. When can I create the bot? For other questions, ask in the "Comments" section. -- Psl631 talk 13:16, 1 May 2018 (UTC)
    • I am not sure how you can possibly program a MediaWiki bot if you do not know what a "library" is. Chenzw  Talk  14:52, 1 May 2018 (UTC)
      • The comments in this discussion have already shown me that you don't have the knowledge level required of a bot operator. I would vote No on this one. Bots can only be run by extremely experienced editors and people who have a good knowledge of programming. It is not just a push-button thing. Running a bot is complicated. -DJSasso (talk) 17:44, 1 May 2018 (UTC)
    • Non-crat comment: @Psl631: You can't blindly revert when you see words like that. You have to look at context. --Auntof6 (talk) 18:12, 1 May 2018 (UTC)
      • Denied. Not trying to claim a monopoly on the automated anti-vandalism business here, but prospective entrants need to show some degree of competency in programming. Chenzw  Talk  17:25, 5 May 2018 (UTC)

KolbertBot[change source]

  • Contributions
  • Operator: Jon Kolbert (talk · contribs)
  • Programming language: Python
  • Function: Replace unsecured http:// links with https:// if supported.
  • Description: Update http:// links to https:// links to make them more secure. Most sites default to HTTPS anyway, so following an HTTP link goes HTTPS→HTTP→HTTPS (from Wikipedia, to http://x.site, to https://x.site) instead of HTTPS→HTTPS (from Wikipedia, straight to https://x.site), which takes less time to load. Also, by using HTTPS, no information is leaked in the HTTP headers along the way, which is better for privacy and security. Jon Kolbert (talk) 13:42, 12 August 2017 (UTC)
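
A rough sketch of what the per-domain review step could look like (the function name and status-code check are assumptions; per the discussion below, the real list was reviewed manually):

```python
import requests

def supports_https(domain, path="/"):
    """Check that a domain actually serves content over HTTPS, rather
    than merely resolving. A manual review would also compare the page
    body against the HTTP version; this only checks the response."""
    try:
        r = requests.get(f"https://{domain}{path}", timeout=10)
        return r.status_code == 200 and r.url.startswith("https://")
    except requests.RequestException:
        return False
```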
    Have you run this task anywhere else, and how are you determining which links to change (i.e. how are you deciding that an https page actually works rather than just resolves)? Go ahead and do 50 trial edits. -DJSasso (talk) 16:24, 19 August 2017 (UTC)
    This task is currently approved to run on the English Wikipedia. Each domain is manually reviewed to ensure https works before being added to the list. So far the list is mainly made up of major news websites. I have run a trial on my main account, as the bot account kept getting tripped up by CAPTCHAs. Jon Kolbert (talk) 16:50, 21 August 2017 (UTC)
    @Djsasso: Ping Jon Kolbert (talk) 02:21, 27 August 2017 (UTC)
    Sorry I was away on vacation. I will take a look at your edits shortly and let you know. -DJSasso (talk) 12:37, 29 August 2017 (UTC)
     Approved. Looks good; just be sure those domains are reviewed prior to being added to the list. -DJSasso (talk) 12:16, 30 August 2017 (UTC)

TohaomgBot[change source]

— This unsigned comment was added by Tohaomg (talk • changes) at 20:20, 22 July 2017.

@Tohaomg: Why would that task need to be done? --Auntof6 (talk) 15:58, 23 July 2017 (UTC)

Generally, SVG images are better than raster ones, because raster images can have issues like compression artifacts, blurring, small resolutions, granularity, non-transparency, etc. --Tohaomg (talk) 19:12, 23 July 2017 (UTC)
Approved for trial (50 edits). Go slow with the trial and I will take a look. -DJSasso (talk) 12:50, 24 July 2017 (UTC)
Denied. Bot owner didn't feel like doing a trial. As such his bot has been denied. -DJSasso (talk) 11:33, 25 July 2017 (UTC)

JJMC89 bot[change source]

This task was requested because the route diagram templates don't use syntax recognized by global replace / CommonsDelinker. This task has been approved on the English Wikipedia and has been running since February 2017. — JJMC89(T·C) 02:32, 12 July 2017 (UTC)
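
In effect the job is a plain-text file-name substitution in the pages that use the old file, since CommonsDelinker cannot parse the route diagram syntax. A hypothetical sketch (the file names are placeholders, and the usingPages call is an assumption about how the pages are found):

```python
import pywikibot

# Hypothetical rename: route diagram templates embed bare file names,
# which global replace / CommonsDelinker does not recognise.
OLD, NEW = "Old icon.svg", "New icon.svg"

site = pywikibot.Site("simple", "wikipedia")
old_file = pywikibot.FilePage(site, "File:" + OLD)
for page in old_file.usingPages():
    text = page.text.replace(OLD, NEW)
    if text != page.text:
        page.text = text
        page.save(summary=f"Replace {OLD} with {NEW}", minor=True)
```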

Approved for trial (50 edits). Go slow with the trial and then I will take a look. -DJSasso (talk) 03:21, 12 July 2017 (UTC)
Trial complete. Only 23 edits were done, since those were the only edits needed at this time. If preferred, this could be run without a bot flag. — JJMC89(T·C) 04:37, 13 July 2017 (UTC)
 Approved. Yeah, I think I will approve it without a flag, as it doesn't look like there will be many edits. If you (or another editor) notice that it starts doing so many that it floods recent changes, let me know and I will flag it. -DJSasso (talk) 10:51, 13 July 2017 (UTC)

AJBot[change source]

  • Contributions
  • Operator: Alexis Jhon Gaspar
  • Programming language: JavaScript, Pywikipedia
  • Function: a hand with a robot chip then a bluetooth wireless device a WIFI and Internet,With a screen of Wikipedias and a computer chip
  • Description: This bot will edit and revert articles to fix mistakes (not including images, videos, GIFs, and MIME-typed files), and block sockpuppets, vandals, and users who have violated the rules.

--Alexis Jhon Gaspar (talk) 07:52, 4 May 2017 (UTC)

Denied. Operator does not appear to understand the basics of operating a bot account. Chenzw  Talk  13:03, 4 May 2017 (UTC)

HammadBot[change source]

  • Contributions
  • Operator: Hammad Saeed
  • Programming language: Pywikipedia
  • Function: Categorize articles, remove mistakes
  • Description: Remove mistakes from articles and categorize articles.

-Hammad Saeed (talk) 05:59, 19 March 2017 (UTC)

  • "Categorize articles" and "remove mistakes" are two separate tasks. Please clarify what exactly this bot will be doing. Chenzw  Talk  06:09, 19 March 2017 (UTC)
You will have to be more specific than that - what exactly do you want to categorise, and what mistakes will the bot be fixing? Chenzw  Talk  06:43, 19 March 2017 (UTC)
Sir, mostly I'll fix Islamic articles (and maybe others) and their related issues, like adding references and categories to the articles, adding citations if needed, and making corrections too. --Hammad Saeed (talk) 07:04, 19 March 2017 (UTC)
Denied. This is not something that a bot should be doing. Chenzw  Talk  07:20, 19 March 2017 (UTC)

Policy Discussion[change source]

Wikidata Phase 1: Interlanguage links (interwikis)[change source]

The way interwiki links are handled is being changed. Instead of coding them on each individual article, most of them will be maintained in Wikidata. You can read more here. In view of this, what will we need to do about the bots we have that maintain interwiki links? Discontinue them? Convert them to remove links from pages where possible? Discuss. :) --Auntof6 (talk) 14:00, 25 February 2013 (UTC)

Pywikipedia, which most of them run on, has been updated to avoid wikis which are now part of the Wikidata system, and there are some bots that will remove the links once they have all been imported to Wikidata. Likely this will make a number of the current bots inactive, as most won't bother continuing after this change. Eventually I will be removing bots that are inactive, per our normal procedure of removing inactive bots. But just to be clear, Wikidata does not automatically make interwiki bots deprecated. I should also note that we (Simple) have not yet been migrated over, so bots will still be operating here as normal. -DJSasso (talk) 14:57, 25 February 2013 (UTC)
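
The cleanup those bots perform amounts to stripping local interlanguage links once Wikidata carries them. A simplified sketch (the regex is illustrative; a production bot would first verify each link actually exists on the Wikidata item):

```python
import re

# Match interlanguage links such as [[de:Titel]] or [[zh-min-nan:Foo]].
# Simplified: a production bot checks the Wikidata item before removing.
INTERWIKI = re.compile(r"\[\[[a-z]{2,3}(?:-[a-z-]+)?:[^\]\n]+\]\]\n?")

def strip_interwikis(wikitext):
    return INTERWIKI.sub("", wikitext)
```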

Request a new bot[change source]

Bot job request -- identify long stub articles[change source]

I often find articles tagged as stubs that are really long enough not to need the tag. Does anyone have a bot that could look at all the pages tagged as stubs, and make a list of the ones that are over a certain length? That way, I (or anyone else interested) could look at the articles to see if the stub tags could be removed. It would be great if the bot could omit tables, infoboxes, and navboxes from the character count, but even including them would be helpful. Thanks. --Auntof6 (talk) 05:32, 31 March 2013 (UTC)
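
A minimal pywikibot sketch of such a listing pass (the category name and the 2,000-character threshold are assumptions for illustration):

```python
import pywikibot

# List stub-tagged articles whose wikitext exceeds a threshold, as a
# starting point for human review, not for automatic de-stubbing.
THRESHOLD = 2000  # arbitrary cutoff

site = pywikibot.Site("simple", "wikipedia")
stubs = pywikibot.Category(site, "Category:Stubs")
for page in stubs.articles(recurse=True):
    if len(page.text) > THRESHOLD:
        print(f"{page.title()}: {len(page.text)} characters")
```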

Stub doesn't just mean short. A stub can be quite long and still be a stub. Stub just means an article that is missing something important. AWB used to treat it as just a character count, but I believe that functionality was removed because it was incorrect. It really does need a human eye to decide whether something should be a stub or not. -DJSasso (talk) 12:43, 1 April 2013 (UTC)
I agree. This would just be a starting point. I'm asking for a list, not for a bot to make the changes. --Auntof6 (talk) 13:04, 1 April 2013 (UTC)
Yeah was just clarifying for anyone that might take care of this. -DJSasso (talk) 14:14, 1 April 2013 (UTC)
The easiest way to do this might be to query the replicated database on Toolserver. Since I'm not that good at this (though I know someone who is :P), do you happen to have any idea of the minimum size to include? If it turns out that a db query isn't the easiest way, I could try to write something in Python to do this.  Hazard-SJ  ✈  02:24, 18 April 2013 (UTC)
Via the db I got User:Hazard-Bot/Long stubs.  Hazard-SJ  ✈  06:21, 21 April 2013 (UTC)
Thanks! I'll take a look at those. --Auntof6 (talk) 23:51, 25 April 2013 (UTC)

┌─────────────────────────────────┘
I think there are two different concepts involved here: the first is that an article shorter than a given size (say 2k characters) is unlikely to fully treat a subject, provided it is not a redirect and has not been proposed for merging. This category is probably easy to find, and can easily be handled by a bot. The other is that a human editor, ideally an expert in the subject, identifies that an article does not cover the subjects it should in the necessary detail. For the first case, it would also be possible to write an edit filter that tags the articles. Edit filters are different from bots in that they only trigger when an article is edited, though. Tags have the benefit that they can be filtered for in the recent changes list. They have the drawback, though, that they won't show the articles that aren't edited. --Eptalon (talk) 07:41, 30 June 2013 (UTC)

I started looking at these back when the list was produced. I found that most of them were only long because they had a lot of tables, infobox data, etc., so I didn't find any (so far) that could be de-stubbed. --Auntof6 (talk) 08:21, 30 June 2013 (UTC)
It's not possible to access the page text from the database, so I'd have to modify the script to load the text separately and try to exclude such things. I'll see what I can do.  Hazard-SJ  ✈  02:11, 3 July 2013 (UTC)
Well, OK, if it's not much trouble. I didn't want to cause a lot of work here. --Auntof6 (talk) 02:40, 3 July 2013 (UTC)
Please take a look at what I have so far (you'll have to sort it by "Length" for now). That length is the number of characters, and it excludes tables using the regular syntax or s-start/s-end templates, as well as any template with "box" or "list" in the title.  Hazard-SJ  ✈  01:03, 10 July 2013 (UTC)
Thanks -- the first few I checked turned up some that I think can be de-stubbed, so I'll work some more with that list in a bit. Thanks for taking the time for this! --Auntof6 (talk) 01:34, 10 July 2013 (UTC)
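
The exclusions Hazard-SJ describes above could be approximated like this (the regexes are illustrative guesses, not the actual script):

```python
import re

def effective_length(wikitext):
    """Character count ignoring tables and box/list templates."""
    # Drop wikitable syntax ({| ... |}).
    text = re.sub(r"\{\|.*?\|\}", "", wikitext, flags=re.DOTALL)
    # Drop templates whose names contain "box" or "list" (infoboxes,
    # navboxes, etc.). Naive: does not handle nested templates.
    text = re.sub(r"\{\{[^{}|]*(?:box|list)[^{}]*\}\}", "",
                  text, flags=re.IGNORECASE | re.DOTALL)
    return len(text)
```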

Bot job request -- add template to pages[change source]

At Template:Chess, there are lots of pages that the template links to, but there are a lot of pages that the template isn't applied to. It will take a lot of time to do this by hand. Can I please have a bot do this? PokestarFan (talk) 22:09, 21 January 2016 (UTC)

Not a good job for an automated bot, but I will do it with AWB. Stand by. --Auntof6 (talk) 22:51, 21 January 2016 (UTC)
OK, done. Let me know if it doesn't look the way you expected. --Auntof6 (talk) 23:02, 21 January 2016 (UTC)
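
The AWB pass described above is roughly equivalent to the following sketch (semi-automated, with each diff still reviewed by hand; the naive transclusion check is an assumption):

```python
import pywikibot

# Append {{Chess}} to each article the navbox links to that does not
# already transclude it. The "in page.text" check is naive about
# capitalisation and whitespace; AWB shows each diff for review.
site = pywikibot.Site("simple", "wikipedia")
navbox = pywikibot.Page(site, "Template:Chess")
for page in navbox.linkedPages(namespaces=[0]):
    if "{{Chess}}" not in page.text:
        page.text = page.text.rstrip() + "\n\n{{Chess}}\n"
        page.save(summary="Add {{Chess}} navbox", minor=True)
```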

Bot job request -- identify and make lists of pages[change source]

I want a bot or two (or even three) to generate lists of articles that are about a city in general. What I said is confusing, so I'll break it down. Have you noticed that pages such as [[Category:Communes in Aube]] have over 200 articles that are all communes? I want bots to find all (or most) of these sections and make lists located at User:PokestarFan/Bot Requests/Lists. Then I can approve them. After that, I want a bot to make navboxes of these pages and put them in every page. Is this possible? PokestarFan (talk) 03:00, 6 February 2016 (UTC)

Approve? What do you need to approve? Also, I am not sure if it is a good idea to be making even more navboxes. (ping Auntof6) Chenzw  Talk  03:10, 6 February 2016 (UTC)
Yes, please don't go overboard creating navboxes. Especially don't create them for things that have a large number of entries like the category you mention: when you do that, the navbox template takes over the articles it's in. Not everything needs a navbox, and we already have a lot of navboxes that we don't really need. Categories often do the job much better, and we've deleted navboxes in the past for that reason. I'd like to suggest this: if you see a need for a navbox, run the idea past someone before you create it. I'm suggesting this to try to avoid getting your work deleted again, and to keep people from thinking you're continuing to be disruptive.
Besides that, a bot probably can't do what you're asking. --Auntof6 (talk) 03:41, 6 February 2016 (UTC)

QD bot[change source]

A bot that automatically updates the three new pages in {{QD picker}}. Wielbiciel Papieża (talk)

Not going to happen since that is a system that should not be implemented on this wiki. -DJSasso (talk) 03:17, 22 October 2017 (UTC)

make bot for adding articles[change source]

Please make a bot for adding articles, like for example sv.wikipedia and sr.wikipedia.org have. You can use bots for adding articles, for example ljs bot.

Not really sure what you are asking for here. Do you mean take articles from other wikis and move them here? We had people do that in the past and the wiki has since declared that we don't want that to happen here. -DJSasso (talk) 17:17, 16 January 2018 (UTC)

Make a bot for marking pages that are complete vandalism with {{QD}}[change source]

Please make a bot which automatically marks pages that are complete vandalism or breaking the rules with {{QD}}. --Psl85 Talk 17:21, 4 March 2018 (UTC)

Not likely to happen, humans are needed for this. -DJSasso (talk) 18:48, 5 March 2018 (UTC)