Wikipedia talk:Bots


Requests for the bot flag should be made on this page. This wiki uses the standard bot policy, and allows global bots and automatic approval of certain types of bots. Other bots should apply below. Global bot flag requests are handled at Meta:Steward requests.

If you are requesting bot status, please use the Current requests section. For other matters dealing with bots, please use the Discussion section.

Current requests

Put new requests at the top. Look here for a form to use to request bot status.

KMLbot

  • Contributions
  • Operator: Evad37 (enwiki)
  • Programming language: Petscan (including a SPARQL Wikidata query) + AWB
  • Function: Adds {{Attached KML}} to articles which have KML files available through Wikidata
  • Description: Basically the same task as the bot performs on English Wikipedia:
    1. Manually generate a list of articles which have KML files available through Wikidata, using a Petscan query [1]
    2. Use the Petscan output as input for AWB. In AWB:
      Append text {{Attached KML}} to article
      Sort metadata afterwards (so as to add it prior to DEFAULTSORT, interwikis, categories and stub templates)
      Apply genfixes (so that template redirects will be bypassed, and not mistakenly sorted as metadata)
      Skip if page is in use, or page is redirect, or page doesn't exist, or page already has {{Attached KML}} template (matches regex \{\{\s*Attached(\s|_)KML\s*(\||\}\}) )
      20 second delay between saves
    • Proposed to run around once a week. The initial run would be around 230 pages, probably much fewer on subsequent runs (depends on the KML creation rate on other wikis).
    • The bot would not only be exclusion-compliant through {{bots}} (via AWB), but would also be exclusion-compliant through the (yet to be created) {{No KML}} template populating [[Category:Pages which should not use KML from Wikidata]] (hidden tracking category also yet to be created). This allows excluded pages to be filtered out with the PetScan query, would enable tracking of such articles through the category, and would encourage editors to provide a reason why the KML shouldn't be used (so that problems could possibly be fixed for all wikis, rather than just ignored at this wiki).
    • The page at User:KMLbot would have info like what is at w:en:User:KMLbot, but adjusted for Simple Wikipedia.
    • Example edits (from enwiki): [2], [3], [4]
    • Approval on enwiki is at w:en:Wikipedia:Bots/Requests for approval/KMLbot - Evad37 (talk) 02:15, 26 October 2016 (UTC)
      • Approved for trial. Could you do a run on our wiki where a few of the proposed changes are made? It wouldn't need a lot, I just want to see it in action before I approve. But it shouldn't be a problem. -DJSasso (talk) 13:01, 26 October 2016 (UTC)
        • Thanks DJSasso, can you give the bot the Confirmed userright and add it to the AWB check page so I can make the trial edits? - Evad37 (talk) 05:55, 27 October 2016 (UTC)
          • What did you need Confirmed for? It isn't a right that 'crats can give on this wiki. Other than editing protected pages, I think you probably are ok for now without being confirmed. You will confirm after 10 edits/4 days anyway. I completely forgot about AWB however, I will do that right now. -DJSasso (talk) 11:09, 27 October 2016 (UTC)
            • I got Confirmed on enwiki [5] as AWB doesn't work unless the account is auto/confirmed. I didn't realise it wasn't available here, but it doesn't matter, I'll just make 10 dummy edits with the bot account. - Evad37 (talk) 13:19, 27 October 2016 (UTC)
            • @Djsasso: Dummy edits done, and 10 trial edits completed too - Evad37 (talk) 13:41, 27 October 2016 (UTC)
              • The edits look good. However, there is one slight thing I am a bit concerned about. The {{Attached KML}} template links to English Wikipedia for the subpages as opposed to local pages. We are a separate wiki and try not to do cross-wiki pulls for information if possible. As such, is this just a bug and you will be creating those subpages on the local template, or was it your intention to always have them link to English Wikipedia? -DJSasso (talk) 14:25, 27 October 2016 (UTC)
                • The intention is that if a KML file doesn't exist locally on this wiki, then the KML file stored on another wiki (English Wikipedia, or 7 others) is used for the links. The subpages aren't really wikipages as such, just containers for the KML files. While Simple English Wikipedia could decide to only use local subpages, and duplicate kml files that exist on other wikis, that's a bit like only allowing local image uploads and not "pulling" any images from Commons. The only reason that subpages are used is that .kml files cannot be directly uploaded to Commons (that feature request is phab:T28059, which has been open since 2010). So while the KML files are pulled from another wiki, I don't think that it's much different to pulling image, video, or sound files from Commons. Plus since the KML files store series of coordinates, they shouldn't need to vary between wikis (but if there is a reason, a local file will take precedence over a file from another wiki). - Evad37 (talk) 03:18, 28 October 2016 (UTC)
                  • @Evad37: Makes sense, I just wanted to make sure there was a good reason. Will it update the local page if at some point there is a local version of the kml, even though as you say that is probably unlikely? Just as an aside, seems like these sorts of things would be the ideal information to be stored at wikidata as opposed to commons? I will add the flag once I hear back on this last question. -DJSasso (talk) 13:01, 31 October 2016 (UTC)
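For reference, the skip-and-append logic described in the request above can be sketched in Python. This is a rough sketch under assumptions: the actual run uses AWB's built-in skip rules rather than custom code, the "page is in use" check is simplified to a template search, and the function names are hypothetical. The {{Attached KML}} regex is taken verbatim from the request.

```python
import re

# Regex from the request: matches an existing {{Attached KML}} transclusion,
# including the template-name variant with an underscore.
ATTACHED_KML = re.compile(r"\{\{\s*Attached(\s|_)KML\s*(\||\}\})")
# Simplified stand-in for AWB's "page is in use" skip rule (assumption).
IN_USE = re.compile(r"\{\{\s*in use", re.I)

def should_skip(wikitext, is_redirect=False):
    """Return True if a page should be skipped, per the AWB skip rules above:
    missing page, redirect, in use, or template already present."""
    if wikitext is None or is_redirect:
        return True
    if IN_USE.search(wikitext):
        return True
    return ATTACHED_KML.search(wikitext) is not None

def append_kml(wikitext):
    """Append {{Attached KML}}; in the real run, AWB's genfixes then sort it
    before DEFAULTSORT, interwikis, categories, and stub templates."""
    return wikitext.rstrip() + "\n{{Attached KML}}\n"
```

Note that, like the regex in the request, the match is case-sensitive, so only the canonical template-name capitalization is detected.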

Macdonald-rossBot

Lenovo09Bot

  • Contributions
  • Operator: Lenovo (may be on another wikipedia like Lenovo)
  • Programming language: Pywikipedia, php, C++, etc.
  • Function: interwiki, regex, etc.
  • Description: Your flag groups. cleaning sandbox. Lenovo09Bot (talk) 06:18, 26 November 2015 (UTC)
    • Denied. I will block future accounts on sight if you continue doing this. Chenzw  Talk  07:46, 26 November 2015 (UTC)

LenovoBot

  • Contributions
  • Operator: Lenovo (may be on another wikipedia like Lenovo)
  • Programming language: Pywikibot
  • Function: Pywikibot
  • Description: You flag group. cleaning sandbox. LenovoBot (talk) 14:06, 25 November 2015 (UTC)
    •  Comment. There is a SUL-related problem with your own contributor (non-bot) account. I believe more information is available on your own user talk page at EN. Until then, we cannot confirm that User:Lenovo is the operator, because this request was made by the bot account. Also, User:Hazard-Bot is already doing the sandbox cleaning task. Chenzw  Talk  14:16, 25 November 2015 (UTC)

Flag get. LenovoBot (talk) 14:18, 25 November 2015 (UTC)

Denied. / Closed without action: You don't seem to understand English well, and we have no confirmation of who the operator is (User:Lenovo does not exist on this wiki). Furthermore, your bot has already made a mistake on Wikidata, which, if I remember correctly, is not possible with the sandbox script in pywikibot. Chenzw  Talk  14:43, 25 November 2015 (UTC)
Again done. The only global bots. (talk) 16:58, 25 November 2015 (UTC)
No. What is going on here shows that you are obviously incompetent in operating a bot responsibly on this wiki. Chenzw  Talk  23:46, 25 November 2015 (UTC)
Yes, again. Arturfrom (talk) 05:06, 26 November 2015 (UTC)

Tulsibot 2

  • Contributions: Tulsibot
  • Operator: Tulsi Bhagat
  • Programming Language: Pywikibot
  • Function: cosmetic changes
  • Description: Have a look at my contributions page here to see what Tulsibot will do.

Functions Review

  1. Replaces underlines by spaces, also multiple underlines,
  2. Removes unnecessary leading spaces from a title,
  3. Removes unnecessary trailing spaces from a title,
  4. Converts URL-encoded characters to unicode,
  5. Removes unnecessary initial and final spaces from a label,
  6. Tries to capitalize the first letter of the title,
  7. Remove useless spaces, etc.
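The title fixes listed above can be sketched as a small Python function. This is a minimal sketch, not Tulsibot's actual code: the function name is hypothetical, and item 5 (label spaces) would be the same `strip()` applied to labels instead of titles.

```python
import re
from urllib.parse import unquote

def normalize_title(title):
    """Apply the cosmetic title fixes listed above."""
    title = unquote(title)                    # 4. URL-encoded characters -> unicode
    title = title.replace("_", " ")           # 1. underscores (single or multiple) -> spaces
    title = re.sub(r" {2,}", " ", title)      # 7. collapse useless repeated spaces
    title = title.strip()                     # 2./3. drop leading/trailing spaces
    if title:
        title = title[0].upper() + title[1:]  # 6. try to capitalize the first letter
    return title
```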

I promise it really works without problems, and it will be helpful for Wikipedia. Thank you -- Tulsi Bhagat (Talk) 09:13, 2 July 2015 (UTC)

Tulsi, the proper procedure is to ask for permission before using the bot account. You have been using it without that permission. Because of that, I have blocked the account. You can use your non-bot account to discuss this. --Auntof6 (talk) 09:50, 2 July 2015 (UTC)

Denied. Generally we don't allow bots for just trivial minor changes. If you want a bot here I suggest becoming a regular editor here so you can understand how this community works before asking to run a bot here which is generally only given to very trusted users. -DJSasso (talk) 18:14, 2 July 2015 (UTC)

Tulsibot

  • Contributions
  • Operator: Tulsi Bhagat
  • Programming Language: Pywikibot
  • Function: fixing redirects and adding commonscat to the articles.
  • Description: Have a look at my contributions page here to see what Tulsibot will do: fixing redirects and adding the template commonscat to the articles. Tulsi Bhagat (Talk) 07:26, 1 July 2015 (UTC)

Non-crat comment: please stop doing these functions. Redirects do not need to be "fixed": sometimes they exist because an article hasn't been created yet, and if the article is created in the future we won't have to change anything. Some of us actively change links to point to redirects on purpose for that reason. As for the Commons categories, in at least one case (1) you added a Commons category that is not a match. Commons tags should be added only if there is an exact match. I am not a 'crat, but I oppose giving the bot flag for these functions. --Auntof6 (talk) 09:05, 1 July 2015 (UTC)

Denied. Auntof6 has pretty much summed up what I was going to say. See WP:NOTBROKEN for why you should not be doing this to redirects. -DJSasso (talk) 19:56, 1 July 2015 (UTC)

AbiBot

  • Contributions
  • Operator: Abigor
  • Programming Language: PHP
  • Function: Welcome
  • Description: I noticed that a lot of users with 10+ edits still didn't get a welcome template. I would like to do a test run for this community for welcoming new users with 10 edits and no reverted edits. This will prevent the bot from welcoming vandals. Abigor (talk) 07:22, 23 February 2015 (UTC)

I am not a 'crat, but if you read the earlier bot requests farther down on this page, you will see that similar requests have been denied because we did not want automated welcoming. --Auntof6 (talk) 07:34, 23 February 2015 (UTC)

Correct, I saw that request, and this bot isn't going to welcome everybody like that bot would do. This bot will welcome people with 10 edits that are non-reverted. So the big difference is that where the other bot would also welcome vandals, this bot will only welcome people that do good work. Or good vandals that didn't get caught. I can even run it with a 24 hour delay, to give the "humans" 24 hours to welcome. Abigor (talk) 07:54, 23 February 2015 (UTC)
Denied. Nope, bots welcoming users is completely rejected by the community because it is impersonal and effectively spam. Not everyone needs to be welcomed immediately, or ever to be honest. -DJSasso (talk) 14:50, 23 February 2015 (UTC)
And to be honest, even if we were going to give that task to someone, it would not likely be someone whose first edit here in 5 years was made today, and who wasn't an active editor even prior to that. Sort of silly to have someone not active here being the person welcoming others. -DJSasso (talk) 14:57, 23 February 2015 (UTC)

BarrasBot

  • Contributions
  • Operator: Barras
  • Programming language: Pywikipedia
  • Function: Archive bot
  • Description: Archiving talk pages etc whenever needed

As mentioned on WP:AN, MiszaBot is currently down and I could probably take over the job. To start with, the bot will only run when I can watch it. I might create a cronjob for it later once it really works without problems. Trial approval or something would be good. -Barras talk 00:49, 19 January 2015 (UTC)

  • Since the page hasn't been edited in ages, I don't know if it is still watched by fellow crats (would love to approve it myself ;-)), but I guess I better ping my fellow crats: @Eptalon, Djsasso, Chenzw, Pmlineditor: -Barras talk 00:56, 19 January 2015 (UTC)
    Approved for trial. Not too many pages, so around 10-15 edits should be enough before flagging I guess. Pmlineditor (t · c · l) 09:59, 19 January 2015 (UTC)
    Doing... - Please note that I granted the bot the "confirmed" flag via meta as it is not yet autoconfirmed and got a captcha. Will remove that flag later. -Barras talk 11:56, 19 January 2015 (UTC)
    Trial complete. - Looks like it works as it should. -Barras talk 12:25, 19 January 2015 (UTC)
     Approved and flagged. Pmlineditor (t · c · l) 12:39, 19 January 2015 (UTC)
    Thanks! I've now also removed the confirmed flag from the bot as everything needed is in the bot right included. -Barras talk 13:07, 19 January 2015 (UTC)
    Cool, I had been doing it with my bot, but I hadn't gotten around to setting up a cronjob to run it at a specific time, so it was a bit haphazard. Good work setting it up. -DJSasso (talk) 14:19, 19 January 2015 (UTC)
    That is exactly what I will be doing for the time being, for two reasons: to be able to watch the bot, and because I simply don't get the cron job working :-) However, the code is not altered, so it should work without any problems. -Barras talk 15:43, 19 January 2015 (UTC)

Policy Discussion

Wikidata Phase 1: Interlanguage links (interwikis)

The way interwiki links are handled is being changed. Instead of coding them on each individual article, most of them will be maintained in Wikidata. You can read more here. In view of this, what will we need to do about the bots we have that maintain interwiki links? Discontinue them? Convert them to remove links from pages where possible? Discuss. :) --Auntof6 (talk) 14:00, 25 February 2013 (UTC)

Pywikipedia, which most of them run on, has been updated to avoid wikis which are now part of the Wikidata system, and there are some bots that will remove the links once they have all been imported to Wikidata. Likely this will make a number of the current bots inactive, as most won't bother continuing after this change. Eventually I will be removing bots that are inactive, per our normal procedure of removing inactive bots. But just to be clear, Wikidata does not automatically make interwiki bots deprecated. I should also note that we at Simple have not yet been migrated over, so bots will still be operating here as normal. -DJSasso (talk) 14:57, 25 February 2013 (UTC)

Request a new bot

Bot job request -- identify long stub articles

I often find articles tagged as stubs that are really long enough not to need the tag. Does anyone have a bot that could look at all the pages tagged as stubs, and make a list of the ones that are over a certain length? That way, I (or anyone else interested) could look at the articles to see if the stub tags could be removed. It would be great if the bot could omit tables, infoboxes, and navboxes from the character count, but even including them would be helpful. Thanks. --Auntof6 (talk) 05:32, 31 March 2013 (UTC)

Stub doesn't just mean short. A stub can be quite long and still be a stub. Stub just means an article that is missing something important. AWB of course used to treat it just as a character count but I do believe that functionality was removed because it was of course incorrect. It really does need a human eye to decide if it should be a stub or not. -DJSasso (talk) 12:43, 1 April 2013 (UTC)
I agree. This would just be a starting point. I'm asking for a list, not for a bot to make the changes. --Auntof6 (talk) 13:04, 1 April 2013 (UTC)
Yeah was just clarifying for anyone that might take care of this. -DJSasso (talk) 14:14, 1 April 2013 (UTC)
The easiest way to do this might be to query the replicated database on Toolserver. Since I'm not nearly good at this (though I know someone who is :P), do you happen to have any idea of the minimum size to include? If it turns out that a db query isn't the easiest way, I could try to write something in Python to do this.  Hazard-SJ  ✈  02:24, 18 April 2013 (UTC)
Via the db I got User:Hazard-Bot/Long stubs.  Hazard-SJ  ✈  06:21, 21 April 2013 (UTC)
Thanks! I'll take a look at those. --Auntof6 (talk) 23:51, 25 April 2013 (UTC)

I think there are two different concepts involved here: The first is that an article shorter than a given size (say 2k characters) is unlikely to fully treat a subject, if it is not a redirect, or has been proposed for merging. This category is probably easy to find, and can easily be handled by a bot. The other category is that a human editor, ideally an expert in the subject, identifies that an article does not cover the subjects it should, in the necessary detail. For the first case, it would also be possible to write an edit filter that tags the articles. Edit filters are different from bots in that they only trigger when an article is edited, though. Tags have the benefit that they can be filtered for in the recent changes list. They have the drawback though that they won't show the articles that aren't edited. --Eptalon (talk) 07:41, 30 June 2013 (UTC)

I started looking at these back when the list was produced. I found that most of them were only long because they had a lot of tables, infobox data, etc., so I didn't find any (so far) that could be de-stubbed. --Auntof6 (talk) 08:21, 30 June 2013 (UTC)
It's not possible to access the page text from the database, so I'd have to modify the script to load the text separately and try to exclude such things. I'll see what I can do.  Hazard-SJ  ✈  02:11, 3 July 2013 (UTC)
Well, OK, if it's not much trouble. I didn't want to cause a lot of work here. --Auntof6 (talk) 02:40, 3 July 2013 (UTC)
Please take a look at what I have so far (you'll have to sort it by "Length" for now). That length is the number of characters, and it excludes tables using the regular syntax or s-start/s-end templates, as well as any template with "box" or "list" in the title.  Hazard-SJ  ✈  01:03, 10 July 2013 (UTC)
Thanks -- the first few I checked turned up some that I think can be de-stubbed, so I'll work some more with that list in a bit. Thanks for taking the time for this! --Auntof6 (talk) 01:34, 10 July 2013 (UTC)
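The exclusion logic described in this thread can be sketched in Python. This is a rough sketch under assumptions: the actual work was done with a database query, the function names are hypothetical, the 2k-character threshold comes from Eptalon's suggestion above, and the simple regexes here do not handle nested tables or nested templates.

```python
import re

def countable_length(wikitext):
    """Rough prose length of an article: drop wikitable markup, s-start/s-end
    blocks, and any (non-nested) template whose text contains 'box' or 'list'."""
    text = re.sub(r"\{\|.*?\|\}", "", wikitext, flags=re.S)  # {| ... |} tables
    text = re.sub(r"\{\{s-start\}\}.*?\{\{s-end\}\}", "", text, flags=re.S | re.I)
    text = re.sub(r"\{\{[^{}]*(box|list)[^{}]*\}\}", "", text, flags=re.I)
    return len(text)

def long_stubs(pages, threshold=2000):
    """Yield (title, length) for stub-tagged pages whose countable length
    is above the threshold; pages is an iterable of (title, wikitext)."""
    for title, wikitext in pages:
        n = countable_length(wikitext)
        if n > threshold:
            yield title, n
```

A human editor would then review the resulting list, since, as noted above, length alone does not decide whether an article is still a stub.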

Bot job request -- Add template to pages

At Template:Chess, there are lots of pages to which the template links. There are a lot of pages the template hasn't been applied to. It will take a lot of time doing this by hand. Can I please have a bot to do this? PokestarFan (talk) 22:09, 21 January 2016 (UTC)

Not a good job for an automated bot, but I will do it with AWB. Stand by. --Auntof6 (talk) 22:51, 21 January 2016 (UTC)
OK, done. Let me know if it doesn't look the way you expected. --Auntof6 (talk) 23:02, 21 January 2016 (UTC)

Bot job request -- Identify and make lists of pages

I want a bot or two (or even three) to generate lists of articles that are about a city in general. What I said is confusing, so I'll break it down. Have you noticed that pages such as [[Category:Communes in Aube]] have over 200 articles that are all communes? I want bots to find all (or most) of these sections and make lists located at User:PokestarFan/Bot Requests/Lists. Then I can approve them. After that, I want a bot to make navboxes of these pages and put them in every page. Is this possible? PokestarFan (talk) 03:00, 6 February 2016 (UTC)

Approve? What do you need to approve? Also, I am not sure if it is a good idea to be making even more navboxes. (ping Auntof6) Chenzw  Talk  03:10, 6 February 2016 (UTC)
Yes, please don't go overboard creating navboxes. Especially don't create them for things that have a large number of entries like the category you mention: when you do that, the navbox template takes over the articles it's in. Not everything needs a navbox, and we already have a lot of navboxes that we don't really need. Categories often do the job much better, and we've deleted navboxes in the past for that reason. I'd like to suggest this: if you see a need for a navbox, run the idea past someone before you create it. I'm suggesting this to try to avoid getting your work deleted again, and to keep people from thinking you're continuing to be disruptive.
Besides that, a bot probably can't do what you're asking. --Auntof6 (talk) 03:41, 6 February 2016 (UTC)