Wikipedia talk:Bots

Shortcuts: WT:BOTS · WT:B · WP:BRFA

Requests for the bot flag should be made on this page. This wiki uses the standard bot policy, and allows global bots and automatic approval of certain types of bots. Other bots should apply below. Global bot flag requests are handled at Meta:Steward requests.


If you are requesting bot status, please use the "Current requests" section. For other matters dealing with bots, please use the "Policy Discussion" section.


Current requests

Put new requests at the top. Look here for a form to use to request bot status.

KolbertBot

  • Contributions
  • Operator: Jon Kolbert (talk · contribs)
  • Programming language: Python
  • Function: Replace unsecured http:// links with https:// if supported.
  • Description: Update http:// links to https:// links to make them more secure. Most sites default to HTTPS anyway, so with an HTTP link the request would go HTTPS→HTTP→HTTPS (from Wikipedia, to http://x.site, to https://x.site) instead of HTTPS→HTTPS (from Wikipedia, to https://x.site), which takes less time to load. Also, by using HTTPS no information is left exposed in the HTTP header, which is better for privacy and security. Jon Kolbert (talk) 13:42, 12 August 2017 (UTC)
    Have you run this task anywhere else, and how are you determining which links to change (i.e. how are you deciding that an https page actually works rather than just resolves)? Go ahead and do 50 trial edits. -DJSasso (talk) 16:24, 19 August 2017 (UTC)
    This task is currently approved to run on the English Wikipedia. Each domain is manually reviewed to ensure https works before being added to the list. So far the list is mainly made up of major news websites. I have run a trial on my main account, as the bot account kept getting tripped up by CAPTCHAs. Jon Kolbert (talk) 16:50, 21 August 2017 (UTC)
    @Djsasso: Ping Jon Kolbert (talk) 02:21, 27 August 2017 (UTC)
    Sorry I was away on vacation. I will take a look at your edits shortly and let you know. -DJSasso (talk) 12:37, 29 August 2017 (UTC)
     Approved. Looks good; just be sure those domains are reviewed prior to being added to the list. -DJSasso (talk) 12:16, 30 August 2017 (UTC)
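
For readers curious how such a task is typically scripted, here is a minimal pywikibot sketch of the approach described above. This is not KolbertBot's actual code: the whitelist contents, the helper name, and the status-code check are illustrative assumptions.

```python
# Illustrative sketch only, not KolbertBot's actual code.
# Assumes a manually reviewed whitelist of domains, per the request above.
import re

import pywikibot
import requests

DOMAINS = ['example.com']  # hypothetical whitelist of reviewed domains


def supports_https(domain):
    """Confirm the HTTPS version actually works, not just resolves."""
    try:
        response = requests.get('https://' + domain, timeout=10)
        return response.status_code == 200
    except requests.RequestException:
        return False


site = pywikibot.Site('simple', 'wikipedia')
for domain in DOMAINS:
    if not supports_https(domain):
        continue
    # Replace http:// only when it is followed by this domain.
    pattern = re.compile(r'http://(?=(?:www\.)?%s)' % re.escape(domain))
    # Find pages linking to the insecure version of this domain.
    for page in site.exturlusage('*.' + domain, protocol='http',
                                 namespaces=[0]):
        new_text = pattern.sub('https://', page.text)
        if new_text != page.text:
            page.text = new_text
            page.save(summary='Bot: replacing http:// with https:// for '
                              'domains that support HTTPS')
```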

TohaomgBot

— This unsigned comment was added by ‎ Tohaomg (talk • changes) at 20:20, 22 July 2017.

@Tohaomg: Why would that task need to be done? --Auntof6 (talk) 15:58, 23 July 2017 (UTC)

Generally, SVG images are better than raster ones, because raster images can have issues like compression artifacts, blur, low resolution, granularity, non-transparency, etc. --Tohaomg (talk) 19:12, 23 July 2017 (UTC)
Approved for trial (50 edits). Go slow with the trial and I will take a look. -DJSasso (talk) 12:50, 24 July 2017 (UTC)
Denied. Bot owner didn't feel like doing a trial. As such his bot has been denied. -DJSasso (talk) 11:33, 25 July 2017 (UTC)

JJMC89 bot

  • Contributions
  • Operator: JJMC89 (talk · contribs)
  • Programming language: Python; source
  • Function: Replace BSicons
  • Description:
    Replace BSicons.
    The config (global) has a blacklist and a whitelist. (The local config is only needed if the global config needs to be overridden for this wiki.)

This task was requested because the route diagram templates don't use syntax recognized by global replace / CommonsDelinker. This task has been approved on English Wikipedia and has been running since February 2017. — JJMC89(T·C) 02:32, 12 July 2017 (UTC)
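
As background for reviewers, a minimal sketch of what such a replacement amounts to, assuming a simple old→new icon-ID map. The bot's real configuration format and matching rules are in the linked source; the sample IDs and method names here are illustrative assumptions based on current pywikibot.

```python
# Minimal sketch of BSicon replacement, not the bot's actual code
# (see the linked source above). Icon IDs appear as bare tokens inside
# route diagram templates, e.g. {{BS3|STR|ABZgl|STR}}, which is why
# global replace / CommonsDelinker cannot rewrite them.
import re

import pywikibot

REPLACEMENTS = {'ABZlf': 'ABZgl'}  # hypothetical old -> new icon IDs

site = pywikibot.Site('simple', 'wikipedia')
for old, new in REPLACEMENTS.items():
    icon = pywikibot.FilePage(site, 'File:BSicon_%s.svg' % old)
    # Match the bare icon ID between pipes/braces, e.g. |ABZlf| or |ABZlf}}
    pattern = re.compile(r'(?<=[|=])%s(?=[|}])' % re.escape(old))
    for page in icon.using_pages():
        new_text = pattern.sub(new, page.text)
        if new_text != page.text:
            page.text = new_text
            page.save(summary='Bot: replacing BSicon %s with %s' % (old, new))
```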

Approved for trial (50 edits). Go slow with the trial and then I will take a look. -DJSasso (talk) 03:21, 12 July 2017 (UTC)
Trial complete. Only 23 edits were done, since those were the only edits needed at this time. If preferred, this could be run without a bot flag. — JJMC89(T·C) 04:37, 13 July 2017 (UTC)
 Approved. Yeah, I think I will approve it without a flag as it doesn't look like there will be many edits. If you (or another editor) notice that it starts doing so many that it floods recent changes, let me know and I will flag it. -DJSasso (talk) 10:51, 13 July 2017 (UTC)

AJBot

  • Contributions
  • Operator: Alexis Jhon Gaspar
  • Programming language: JavaScript, PyWikipedia
  • Function: a hand with a robot chip then a bluetooth wireless device a WIFI and Internet, with a screen of Wikipedias and a computer chip
  • Description: This is edit and revert arcticles from mistakes and not inculde image, videos, gif and MIME typed files and Blocking Shockpuppets, Vandas, and Violated the Rules

--__Alexis Jhon Gaspar (talk) 07:52, 4 May 2017 (UTC)

Denied. Operator does not appear to understand the basics of operating a bot account. Chenzw  Talk  13:03, 4 May 2017 (UTC)

HammadBot

  • Contributions
  • Operator: Hammad Saeed
  • Programming language: Pywikipedia
  • Function: Categorize articles, remove mistakes
  • Description: Remove mistakes from articles and categorize articles.

-Hammad Saeed (talk) 05:59, 19 March 2017 (UTC)

  • "Categorize articles" and "remove mistakes" are two separate tasks. Please clarify what exactly this bot will be doing. Chenzw  Talk  06:09, 19 March 2017 (UTC)
You will have to be more specific than that - what exactly do you want to categorise, and what mistakes will the bot be fixing? Chenzw  Talk  06:43, 19 March 2017 (UTC)
Sir, mostly I'll fix Islamic articles (and maybe others) and their related issues, like adding references and categories to the articles, adding citations if needed, and corrections too. --Hammad Saeed (talk) 07:04, 19 March 2017 (UTC)
Denied. This is not something that a bot should be doing. Chenzw  Talk  07:20, 19 March 2017 (UTC)

KMLbot

  • Contributions
  • Operator: Evad37 (enwiki)
  • Programming language: Petscan (including a SPARQL Wikidata query) + AWB
  • Function: Adds {{Attached KML}} to articles which have KML files available through Wikidata
  • Description: Basically the same task as the bot performs on English Wikipedia:
    1. Manually generate a list of articles which have KML files available through Wikidata, using a Petscan query [1]
    2. Use the Petscan output as input for AWB. In AWB:
      Append text {{Attached KML}} to article
      Sort meta data afterwards (so that the template is added before DEFAULTSORT, interwikis, categories, and stub templates)
      Apply genfixes (so that template redirects will be bypassed, and not mistakenly sorted as meta data)
      Skip if page is in use, or page is redirect, or page doesn't exist, or page already has {{Attached KML}} template (matches regex \{\{\s*Attached(\s|_)KML\s*(\||\}\}) )
      20 second delay between saves
    • Proposed to run around once a week. The initial run would be around 230 pages, probably much less on subsequent runs (depends on KML creation rate on other wikis).
    • The bot would not only be exclusion-compliant through {{bots}} (via AWB), but would also be exclusion-compliant through the (yet to be created) {{No KML}} template populating [[Category:Pages which should not use KML from Wikidata]] (hidden tracking category, also yet to be created). This allows excluded pages to be filtered out with the PetScan query, would enable tracking of such articles through the category, and would encourage editors to provide a reason why the KML shouldn't be used (so that problems could possibly be fixed for all wikis, rather than just ignored at this wiki).
    • The page at User:KMLbot would have info like what is at w:en:User:KMLbot, but adjusted for Simple Wikipedia.
    • Example edits (from enwiki): [2], [3], [4]
    • Approval on enwiki is at w:en:Wikipedia:Bots/Requests for approval/KMLbot - Evad37 (talk) 02:15, 26 October 2016 (UTC)
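
To make the skip conditions above concrete, here is a Python rendering of them using the regex quoted in the request. The actual task runs through AWB, not this code; the sketch omits the meta-data sorting, genfixes, and "in use" steps, and the example title is hypothetical.

```python
# Sketch of the AWB skip logic described in the request; not the bot's
# actual code (the real run is a PetScan list fed into AWB).
import re

import pywikibot

# The skip regex quoted in the request above:
HAS_KML = re.compile(r'\{\{\s*Attached(\s|_)KML\s*(\||\}\})')


def should_skip(page):
    """Mirror AWB's skip conditions: missing, redirect, or already tagged."""
    if not page.exists() or page.isRedirectPage():
        return True
    return HAS_KML.search(page.text) is not None


site = pywikibot.Site('simple', 'wikipedia')
page = pywikibot.Page(site, 'Interstate 95')  # hypothetical example title
if not should_skip(page):
    # The real task sorts the template before DEFAULTSORT/categories;
    # appending at the end is a simplification.
    page.text = page.text.rstrip() + '\n{{Attached KML}}\n'
    page.save(summary='Bot: adding {{Attached KML}} (KML available via Wikidata)')
```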
      • Approved for trial. Could you do a run on our wiki where a few of the proposed changes are made? It wouldn't need a lot; I just want to see it in action before I approve, but it shouldn't be a problem. -DJSasso (talk) 13:01, 26 October 2016 (UTC)
        • Thanks DJSasso, can you give the bot Confirmed userright and add it the AWB check page so I can make the trial edits? - Evad37 (talk) 05:55, 27 October 2016 (UTC)
            • What did you need Confirmed for? It isn't a right that 'crats can give on this wiki. Other than editing protected pages, I think you are probably OK for now without being confirmed; you will be autoconfirmed after 10 edits / 4 days anyway. I completely forgot about AWB, however; I will do that right now. -DJSasso (talk) 11:09, 27 October 2016 (UTC)
            • I got Confirmed on enwiki [5] as AWB doesn't work unless the account is auto/confirmed. I didn't realise it wasn't available here, but it doesn't matter, I'll just make 10 dummy edits with the bot account. - Evad37 (talk) 13:19, 27 October 2016 (UTC)
            • @Djsasso: Dummy edits done, and 10 trial edits completed too - Evad37 (talk) 13:41, 27 October 2016 (UTC)
              • The edits look good. However, there is one slight thing I am a bit concerned about: the {{Attached KML}} template links to en.wiki for the subpages as opposed to local pages. We are a separate wiki and try not to do cross-wiki pulls for information if possible. As such, is this just a bug (and you will be creating those subpages on the local template), or was it your intention to always have them link to en.wiki? -DJSasso (talk) 14:25, 27 October 2016 (UTC)
                • The intention is that if a KML file doesn't exist locally on this wiki, then the KML file stored on another wiki (English Wikipedia, or 7 others) is used for the links. The subpages aren't really wikipages as such, just containers for the KML files. While Simple English Wikipedia could decide to only use local subpages, and duplicate kml files that exist on other wikis, that's a bit like only allowing local image uploads and not "pulling" any images from Commons. The only reason that subpages are used is that .kml files cannot be directly uploaded to Commons (that feature request is phab:T28059, which has been open since 2010). So while the KML files are pulled from another wiki, I don't think that it's much different to pulling image, video, or sound files from Commons. Plus since the KML files store series of coordinates, they shouldn't need to vary between wikis (but if there is a reason, a local file will take precedence over a file from another wiki). - Evad37 (talk) 03:18, 28 October 2016 (UTC)
                  • @Evad37: Makes sense, I just wanted to make sure there was a good reason. Will it update the local page if at some point there is a local version of the KML, even though, as you say, that is probably unlikely? Just as an aside, it seems like these sorts of things would be the ideal information to be stored at Wikidata as opposed to Commons? I will add the flag once I hear back on this last question. -DJSasso (talk) 13:01, 31 October 2016 (UTC)

Policy Discussion

Wikidata Phase 1: Interlanguage links (interwikis)

The way interwiki links are handled is being changed. Instead of coding them on each individual article, most of them will be maintained in Wikidata. You can read more here. In view of this, what will we need to do about the bots we have that maintain interwiki links? Discontinue them? Convert them to remove links from pages where possible? Discuss. :) --Auntof6 (talk) 14:00, 25 February 2013 (UTC)

Pywikipedia, which most of them run on, has been updated to avoid wikis which are now part of the Wikidata system, and there are some bots that will remove the links once they have all been imported to Wikidata. Likely this will make a number of the current bots inactive, as most won't bother continuing after this change. Eventually I will be removing bots that are inactive per our normal procedure of removing inactive bots. But just to be clear, Wikidata does not automatically make interwiki bots deprecated. I should also note that we at Simple have not yet been migrated over, so bots will still be operating here as normal. -DJSasso (talk) 14:57, 25 February 2013 (UTC)
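
As an illustration of the cleanup DJSasso describes (not any particular bot's actual code), a pywikibot sketch that strips local interwiki links only when the page is already connected to a Wikidata item; the example title is arbitrary.

```python
# Hedged sketch of post-migration interwiki cleanup; not any
# particular bot's actual code.
import pywikibot
from pywikibot import textlib

site = pywikibot.Site('simple', 'wikipedia')
page = pywikibot.Page(site, 'Berlin')  # arbitrary example title

try:
    # Raises NoPageError if the page is not connected to Wikidata yet.
    item = pywikibot.ItemPage.fromPage(page)
except pywikibot.exceptions.NoPageError:
    item = None

if item is not None:
    # Only strip local links once Wikidata serves them.
    new_text = textlib.removeLanguageLinks(page.text, site=site)
    if new_text != page.text:
        page.text = new_text
        page.save(summary='Bot: removing interwiki links now provided by Wikidata')
```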

Request a new bot

Bot job request -- identify long stub articles

I often find articles tagged as stubs that are really long enough not to need the tag. Does anyone have a bot that could look at all the pages tagged as stubs, and make a list of the ones that are over a certain length? That way, I (or anyone else interested) could look at the articles to see if the stub tags could be removed. It would be great if the bot could omit tables, infoboxes, and navboxes from the character count, but even including them would be helpful. Thanks. --Auntof6 (talk) 05:32, 31 March 2013 (UTC)

Stub doesn't just mean short. A stub can be quite long and still be a stub: stub just means an article that is missing something important. AWB used to treat it just as a character count, but I believe that functionality was removed because it was incorrect. It really does need a human eye to decide if it should be a stub or not. -DJSasso (talk) 12:43, 1 April 2013 (UTC)
I agree. This would just be a starting point. I'm asking for a list, not for a bot to make the changes. --Auntof6 (talk) 13:04, 1 April 2013 (UTC)
Yeah was just clarifying for anyone that might take care of this. -DJSasso (talk) 14:14, 1 April 2013 (UTC)
The easiest way to do this might be to query the replicated database on Toolserver. Since I'm not that good at this (though I know someone who is :P), do you happen to have any idea of the minimum size to include? If it turns out that a db query isn't the easiest way, I could try to write something in Python to do this.  Hazard-SJ  ✈  02:24, 18 April 2013 (UTC)
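
For anyone wanting to reproduce such a list without database access, a rough pywikibot sketch follows. The real list below was generated from the database replica instead; the category name and the 2,000-character threshold here are assumptions.

```python
# Rough sketch of the requested report; the actual list was generated
# with a database query. Category name and threshold are assumptions.
import pywikibot

MIN_LENGTH = 2000  # characters; pick whatever cutoff seems useful

site = pywikibot.Site('simple', 'wikipedia')
stub_cat = pywikibot.Category(site, 'Category:Stubs')

long_stubs = []
for page in stub_cat.articles(recurse=True, namespaces=[0]):
    size = len(page.text)  # raw wikitext length, tables and all
    if size >= MIN_LENGTH:
        long_stubs.append((size, page.title()))

# Longest first, like sorting the on-wiki report by "Length".
for size, title in sorted(long_stubs, reverse=True):
    print(size, title)
```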
Via the db I got User:Hazard-Bot/Long stubs.  Hazard-SJ  ✈  06:21, 21 April 2013 (UTC)
Thanks! I'll take a look at those. --Auntof6 (talk) 23:51, 25 April 2013 (UTC)

I think there are two different concepts involved here. The first is that an article shorter than a given size (say 2k characters), which is not a redirect and has not been proposed for merging, is unlikely to fully treat its subject. This category is probably easy to find, and can easily be handled by a bot. The other is that a human editor, ideally an expert in the subject, identifies that an article does not cover the subject in the necessary detail. For the first case, it would also be possible to write an edit filter that tags the articles. Edit filters are different from bots in that they only trigger when an article is edited, though. Tags have the benefit that they can be filtered for in the recent changes list. They have the drawback, though, that they won't show the articles that aren't edited. --Eptalon (talk) 07:41, 30 June 2013 (UTC)

I started looking at these back when the list was produced. I found that most of them were only long because they had a lot of tables, infobox data, etc., so I didn't find any (so far) that could be de-stubbed. --Auntof6 (talk) 08:21, 30 June 2013 (UTC)
It's not possible to access the page text from the database, so I'd have to modify the script to load the text separately and try to exclude such things. I'll see what I can do.  Hazard-SJ  ✈  02:11, 3 July 2013 (UTC)
Well, OK, if it's not much trouble. I didn't want to cause a lot of work here. --Auntof6 (talk) 02:40, 3 July 2013 (UTC)
Please take a look at what I have so far (you'll have to sort it by "Length" for now). That length is the number of characters, and it excludes tables using the regular syntax or s-start/s-end templates, as well as any template with "box" or "list" in the title.  Hazard-SJ  ✈  01:03, 10 July 2013 (UTC)
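
A hedged reconstruction of that length measure, for anyone who wants to replicate it: the regexes below are approximations of what such a script might do, not Hazard-SJ's actual code, and they deliberately ignore nested templates.

```python
# Approximate reconstruction of the length measure described above;
# not Hazard-SJ's actual script, and the regexes are naive (no nesting).
import re


def stub_length(text):
    """Character count after stripping tables and box/list templates."""
    # Drop wikitable syntax: {| ... |}
    text = re.sub(r'\{\|.*?\|\}', '', text, flags=re.DOTALL)
    # Drop s-start ... s-end succession-box blocks
    text = re.sub(r'\{\{\s*[Ss]-start.*?\{\{\s*[Ss]-end\s*\}\}', '', text,
                  flags=re.DOTALL)
    # Drop any template whose name contains "box" or "list"
    text = re.sub(r'\{\{[^{}|]*(?:box|list)[^{}]*\}\}', '', text,
                  flags=re.IGNORECASE | re.DOTALL)
    return len(text)


# Example: the table and infobox are excluded from the count.
print(stub_length('Short text. {| class="wikitable" |}{{Infobox person}}'))
```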
Thanks -- the first few I checked turned up some that I think can be de-stubbed, so I'll work some more with that list in a bit. Thanks for taking the time for this! --Auntof6 (talk) 01:34, 10 July 2013 (UTC)

Bot job request -- add template to pages

At Template:Chess, there are lots of pages to which the template links. There are a lot of pages to which the template hasn't been applied. It would take a lot of time to do this by hand. Can I please have a bot do this? PokestarFan (talk) 22:09, 21 January 2016 (UTC)

Not a good job for an automated bot, but I will do it with AWB. Stand by. --Auntof6 (talk) 22:51, 21 January 2016 (UTC)
OK, done. Let me know if it doesn't look the way you expected. --Auntof6 (talk) 23:02, 21 January 2016 (UTC)
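
For the record, a fully automated version of this job would look roughly like the following pywikibot sketch. The actual edits were made semi-automatically with AWB, so this is illustrative only.

```python
# Rough pywikibot equivalent of the AWB run described above;
# illustrative only -- the actual edits were made with AWB.
import pywikibot

site = pywikibot.Site('simple', 'wikipedia')
navbox = pywikibot.Page(site, 'Template:Chess')

for page in navbox.linkedPages(namespaces=[0]):
    if not page.exists() or page.isRedirectPage():
        continue
    if any(t.title() == 'Template:Chess' for t in page.templates()):
        continue  # already transcludes the navbox
    page.text = page.text.rstrip() + '\n{{Chess}}\n'
    page.save(summary='Bot: adding {{Chess}} navbox to pages it links to')
```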

Bot job request -- identify and make lists of pages

I want a bot or two (or even three) to generate lists of articles that are all about the same general subject. What I said is confusing, so breaking it down: have you noticed that pages such as [[Category:Communes in Aube]] have over 200 articles that are all communes? I want bots to find all (or most) of these categories and make lists located at User:PokestarFan/Bot Requests/Lists. Then I can approve them. After that, I want a bot to make navboxes of these pages and put them in every page. Is this possible? PokestarFan (talk) 03:00, 6 February 2016 (UTC)

Approve? What do you need to approve? Also, I am not sure if it is a good idea to be making even more navboxes. (ping Auntof6) Chenzw  Talk  03:10, 6 February 2016 (UTC)
Yes, please don't go overboard creating navboxes. Especially don't create them for things that have a large number of entries like the category you mention: when you do that, the navbox template takes over the articles it's in. Not everything needs a navbox, and we already have a lot of navboxes that we don't really need. Categories often do the job much better, and we've deleted navboxes in the past for that reason. I'd like to suggest this: if you see a need for a navbox, run the idea past someone before you create it. I'm suggesting this to try to avoid getting your work deleted again, and to keep people from thinking you're continuing to be disruptive.
Besides that, a bot probably can't do what you're asking. --Auntof6 (talk) 03:41, 6 February 2016 (UTC)