Wikipedia:Bots/Fulfilled requests/2020

From Simple English Wikipedia, the free encyclopedia

HousekeepingBot[change source]

  • Operator: FNAFPUPPETMASTER
  • Programming language: Python (Pywikibot)
  • Function: Tagging
  • Description: The bot would tag newly protected pages, and remove the tags when the protection expired. It would be run manually every Thursday.

--rollingbarrels (talk) 01:09, 12 January 2020 (UTC)[reply]

Which pywikibot script will you be using for this task? Chenzw  Talk  02:11, 12 January 2020 (UTC)[reply]
@Chenzw: blockpageschecker, addtext, & listpages. rollingbarrels (talk) 02:22, 12 January 2020 (UTC)[reply]
@FNAFPUPPETMASTER: Does "manually run every Thursday" mean it would only update once a week? Pages can be protected and then unprotected both in less than a week. Computer Fizz (talk) 03:25, 12 January 2020 (UTC)[reply]
@Computer Fizz: Of course, but once I get it self-hosted, it'll be fully automatic (running once an hour). rollingbarrels (talk) 03:29, 12 January 2020 (UTC)[reply]
@FNAFPUPPETMASTER: Awesome, once an hour definitely sounds better than once a week. Is there a reason you can't just use EventSource to make it update immediately, though? Computer Fizz (talk) 03:30, 12 January 2020 (UTC)[reply]
@Computer Fizz: I'd rather start with hourly updates in case it has a lot of false positives for the first 3 weeks. rollingbarrels (talk) 03:57, 12 January 2020 (UTC)[reply]
False positives really only happen with machine learning; I don't see how it would fail to recognize whether or not a page is protected. Computer Fizz (talk) 03:58, 12 January 2020 (UTC)[reply]

┌─────────────────────────────────┘
 (change conflict)  False positives? Why would such a task even result in false positives? Chenzw  Talk  03:59, 12 January 2020 (UTC)[reply]

@Computer Fizz: @Chenzw: By that I mean when a page protection status is changed (level or duration) within one hour of it happening. rollingbarrels (talk) 04:06, 12 January 2020 (UTC)[reply]
That is not a false positive. That just means the protection template tagging is outdated, which is supposedly what your bot is aiming to resolve. Chenzw  Talk  04:08, 12 January 2020 (UTC)[reply]
@Chenzw: For the first little while, while I'm self-hosting, I'm planning to make a log that both acts as a failsafe in case I make a breaking change and it loses its current "monitoring" of a page, and as a double-checker to make sure that the page has the right protection and the right duration. It will update every hour because, like I said earlier, I want it to only have the latest tag, not both tags. rollingbarrels (talk) 04:17, 12 January 2020 (UTC)[reply]
I am not following. The blockpageschecker script corrects and removes protection templates from pages as necessary, so I don't see how you can end up with more than one protection template on an article if you run the bot. Chenzw  Talk  05:04, 12 January 2020 (UTC)[reply]
@Chenzw: The tag would only be doubled in the logs; the front-end changes aren't affected. rollingbarrels (talk) 06:02, 12 January 2020 (UTC)[reply]
Why would there even be a double entry in the logs? Chenzw  Talk  06:04, 12 January 2020 (UTC)[reply]
The bot's own logs, not Special:Log. rollingbarrels (talk) 06:06, 12 January 2020 (UTC)[reply]

┌─────────────────────────────────┘
I am quite aware that you are talking about your bot's own logs, but why would there even be duplicate entries in the logs? Chenzw  Talk  06:09, 12 January 2020 (UTC)[reply]

@Chenzw: Ah, sorry, I just figured out a fix. Instead of looking at Special:ProtectedPages for pages it hadn't seen yet with a certain protection, it could monitor Special:Log for protections via Python's sseclient. rollingbarrels (talk) 06:17, 12 January 2020 (UTC)[reply]
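For readers curious what the sseclient approach mentioned above might look like, here is a minimal sketch. It is illustrative only (not the bot's actual code, which was never written): the pure check filters Wikimedia EventStreams events down to protection log entries on this wiki, using field names from the public "recentchange" stream schema, and `watch_protections` is a hypothetical wrapper that is never invoked here.

```python
import json


def is_simple_protection_event(data):
    # A protection log entry on Simple English Wikipedia, per the
    # "recentchange" EventStreams schema.
    return (
        data.get("wiki") == "simplewiki"
        and data.get("type") == "log"
        and data.get("log_type") == "protect"
    )


def watch_protections():
    # Requires the third-party "sseclient" package; this loop runs forever.
    from sseclient import SSEClient
    url = "https://stream.wikimedia.org/v2/stream/recentchange"
    for event in SSEClient(url):
        if not event.data:
            continue
        change = json.loads(event.data)
        if is_simple_protection_event(change):
            # A real bot would (re)tag or untag the page here.
            print(change["title"], change.get("log_action"))
```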
Doesn't the existence of this bot mean we'll have to grant editprotected to the bots group? Perhaps it would be better if this bot were run by someone who's already an admin on their main account. Computer Fizz (talk) 07:37, 12 January 2020 (UTC)[reply]
@Computer Fizz: If I need to, I can send a message reminding the blocking admin to place a {{pp}} when they fully protect it and send it again when it would expire, if you aren't comfortable with me having a tool that has the editprotected flag. However, admins can block and mass rollback this and other bots just like a normal user. rollingbarrels (talk) 07:45, 12 January 2020 (UTC)[reply]
I'm also willing to put this request on hold until I get more experience (possibly: 2-3 weeks) rollingbarrels (talk) 08:00, 12 January 2020 (UTC)[reply]
What I think should happen here is that an already existing bot is sysopped and then starts adding the protection notices. Most likely one owned by an admin. Computer Fizz (talk) 08:32, 12 January 2020 (UTC)[reply]
I am not particularly keen on granting approval for this task. There are currently 12 articles with a non-indef protection period, which is a minuscule fraction of our total articles. Furthermore, protection templates are not routinely used by administrators - most of the time it is left to MediaWiki:Protectedpagetext to inform prospective editors of page protection. Chenzw  Talk  09:24, 12 January 2020 (UTC)[reply]
That's the purpose of the bot: to do small, unnecessary but still useful tasks. rollingbarrels (talk) 09:29, 12 January 2020 (UTC)[reply]
Denied. I have been thinking this over for a few hours. But I am not comfortable giving the bot flag to an editor that is as new to this wiki as you are. Especially considering your past on en.wiki. Also as Chenzw points out, we don't really use the protection templates here. However, we can probably add this functionality to an existing bot if it is desired. -DJSasso (talk) 01:01, 13 January 2020 (UTC)[reply]

┌──────────────────────────┘
As I mentioned above, any bot that does this will most likely need to be controlled by an admin, since it'll need to edit fully protected pages. I don't think it really matters other than that, as long as some bot is adding them (I don't see why the software doesn't add it, but oh well, this is the next best thing...) Computer Fizz (talk) 01:26, 13 January 2020 (UTC)[reply]

Probably doesn't add it automatically because it uses MediaWiki:Protectedpagetext to indicate protection. The lock icons are really just "beautification", which anyone using MediaWiki can decide whether or not to have on their own wiki. But yes, you are correct: it would need to be an admin bot to edit fully protected pages, which would mean it would have to be an admin running it. -DJSasso (talk) 01:33, 13 January 2020 (UTC)[reply]
@DJSasso: It still may be a good idea to have the lock notices. Obviously it would be tedious for a human to do this, especially with short protections that only last one or two hours, but I don't see a problem with having a bot do it, as long as the owner can be trusted to edit fully-protected pages. Computer Fizz (talk) 01:37, 13 January 2020 (UTC)[reply]
The bot would have access to anything an admin can do. So it wouldn't just be trusting them to edit fully protected pages, but with everything an admin has access to. But yes. I am looking right now at the scripts to see if it's worth me setting it up here on my bot, which already runs nightly. However, it would need community approval to flag the bot as admin. So it may not be worth the effort. Will look into what those scripts entail first. -DJSasso (talk) 01:41, 13 January 2020 (UTC)[reply]

InternetArchiveBot[change source]

  • Contributions
  • Operator: Cyberpower678
  • Programming language: PHP
  • Function: Fix dead links, blue book refs
  • Description: This bot will scan Wikipedia looking for dead links. Dead links will be supplemented, or replaced, with archive URLs. With a feature currently in the works, the bot will also look for unlinked book references and add a link to a preview of the book, or of the specific page if mentioned in the reference.

CYBERPOWER (Chat) 19:54, 11 September 2020 (UTC)[reply]

Approved for trial (50 edits). --Chenzw  Talk  16:50, 12 September 2020 (UTC)[reply]
Chenzw, The bot is running into captcha requests.—CYBERPOWER (Chat) 19:45, 12 September 2020 (UTC)[reply]
Trial complete.—CYBERPOWER (Chat) 00:50, 13 September 2020 (UTC)[reply]
 Approved. Apologies, completely slipped my mind that the edits were going to trigger the captcha. Flag granted. Chenzw  Talk  11:44, 13 September 2020 (UTC)[reply]

DaedanBot[change source]

— Gomdoli (talk) 01:03, 31 August 2020 (UTC)[reply]

Is this "one-time" or continuous, automatic, constant check and retarget for any category that has been moved? Naleksuh (talk) 01:19, 31 August 2020 (UTC)[reply]
It's not automatic, but I'm planning to run when there's a new redirect category. — Gomdoli (talk) 01:29, 31 August 2020 (UTC)[reply]
In that case you may be looking for WT:AWB/CP. For the African-American pages though, I would say there are few enough that it could be done manually. Naleksuh (talk) 01:31, 31 August 2020 (UTC)[reply]
Thank you for explaining. I will use this bot account. — Gomdoli (talk) 02:53, 31 August 2020 (UTC)[reply]
There is already a bot that does this automatically, without needing to be run manually. If you are talking about a specific category that isn't cleared yet, it is because any new category move has a cool-down period: the bot waits 7 days after a move in case the move is reverted, which is the case with the African Americans pages. This is to prevent unnecessary edits moving the pages back and forth between categories. -Djsasso (talk) 11:05, 31 August 2020 (UTC)[reply]
You can see its log of work here. I am inclined to decline this request as not being necessary, and since generally we don't hand out AWB just to be run on an adhoc basis on this wiki. -Djsasso (talk) 11:20, 31 August 2020 (UTC)[reply]
This is not a one-time plan. It's not automatic, but I will try to run it steadily. — Gomdoli (talk) 23:47, 31 August 2020 (UTC)[reply]
Yes, I am aware; my point was that we have a bot that already does it automatically (it runs every day, so articles never sit in a redirected category more than a day, unless it's a recently moved category where they wait 7 days), so we don't need someone who does it manually. -Djsasso (talk) 11:16, 1 September 2020 (UTC)[reply]
@Auntof6: — Gomdoli (talk) 00:22, 1 September 2020 (UTC)[reply]

 Not done I see no benefit in duplicating a process that's already done by an automated bot. I'm also reluctant to give bot permission to a user who has been editing on this wiki for only a little more than two weeks. --Auntof6 (talk) 00:41, 1 September 2020 (UTC)[reply]

I'll request when I need it next time. — Gomdoli (talk) 00:45, 1 September 2020 (UTC)[reply]
@Auntof6: Is it not the case that only bureaucrats can review these requests, or does that only apply to approvals? --IWI (talk) 00:50, 1 September 2020 (UTC)[reply]
Good question. I forgot that it takes a crat to grant that right. I'll strike my response and let a crat respond. However, I'll leave the account blocked. --Auntof6 (talk) 03:36, 1 September 2020 (UTC)[reply]
Yes it can only be a crat. I guess I should make my earlier comment more clear with a template  Not done. -Djsasso (talk) 11:16, 1 September 2020 (UTC)[reply]

ChenzwBot[change source]

  • Contributions
  • Operator: Chenzw
  • Programming language: Python (custom library)
  • Function: Maintenance of Wikipedia:Good articles/by date
  • Description: This proposed new task involves automatic population/removal of entries from Wikipedia:Good articles/by date - the current manual workflow involves very tedious renumbering of items in the table, with the process becoming even more painful if the oldest GAs are being demoted. I don't anticipate needing a bot flag for this, and the bot will mark such edits as non-bot explicitly. The code hasn't been written yet, but this is not expected to take long, because the MediaWiki API functionality is already in a library that was created for the bot's anti-vandalism work (so that's half the code already done). I am requesting "in-principle approval" first, before writing the remaining bot code, just in case there is a serious objection to this particular bot task.

--Chenzw  Talk  16:40, 29 August 2020 (UTC)[reply]

When does this bot decide to edit the table? Does it watch the status of GA tags themselves, or is it semi-automated/operator input? Naleksuh (talk) 23:36, 29 August 2020 (UTC)[reply]
The current intention is to run the task on a schedule. The task will check the category members and compare it against the GA list. Chenzw  Talk  03:22, 31 August 2020 (UTC)[reply]
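The scheduled comparison Chenzw describes (category members against the current list) amounts to a set difference. A minimal sketch, with illustrative names rather than the bot's actual code:

```python
def diff_ga_list(category_members, listed_titles):
    # category_members: pages currently in the GA category.
    # listed_titles: entries already in Wikipedia:Good articles/by date.
    category = set(category_members)
    listed = set(listed_titles)
    to_add = sorted(category - listed)     # promoted, but not yet listed
    to_remove = sorted(listed - category)  # demoted, but still in the table
    return to_add, to_remove
```

Renumbering the table then only needs to happen once per run, after both lists of changes are known.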
Yep go for it. -Djsasso (talk) 11:08, 31 August 2020 (UTC)[reply]

Brantmeierz-bot[change source]

  • Contributions
  • Operator: Brantmeierz
  • Programming language: Java (using a slightly modified version of https://github.com/MER-C/wiki-java)
  • Function: Redirect management
  • Description: I've been spending a lot of time manually creating and categorizing redirects, many of which follow rules that could be easily identified and applied automatically. In addition, something that I haven't been adding while manually working on redirects is Template:Rcat shell, which helps a lot with the clarity of the pages and with grouping those that have multiple redirect reasons. My bot would perform these behind-the-scenes tasks only on redirect pages.

--Brantmeierz (talk) 03:03, 22 January 2020 (UTC)[reply]

Some I had in mind to apply, and their logic, were:
  • Template:R from other capitalization; guaranteed to identify (if the source and destination page titles are different, but equal when both are forced to lowercase, they must differ only in capitalization)
  • Template:R to section; guaranteed to identify (check if destination is linked to a section/contains #)
  • Template:R from specific geographic name; fairly easy to identify (look for redirects where destination page is a location and the page title is {Destination}, {Other identifiers})
  • Template:R from acronym; wouldn't identify all of them, but a conservative method would be finding redirect pages where the destination page consists only of words beginning with each consecutive letter of the source page's name
Others, like R from other spelling, plurals, etc., would definitely be possible to identify using a dictionary file or APIs, but since that has a much greater chance of producing erroneous results, it's not something I would attempt with a bot yet. Brantmeierz (talk) 16:02, 22 January 2020 (UTC)[reply]
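The rules listed above can be expressed as small predicate functions. The sketch below is illustrative only (not Brantmeierz's actual Java code), and the acronym check deliberately takes the conservative route described: every word of the target must match a letter of the source, with no leftovers.

```python
def r_from_other_capitalization(source, target):
    # Titles differ, but match once both are lowercased.
    return source != target and source.lower() == target.lower()


def r_to_section(target):
    # The redirect target points at a section anchor.
    return "#" in target


def r_from_acronym(source, target):
    # Conservative: each word of the target must start with the
    # corresponding letter of the source, and the counts must match.
    words = target.split()
    if len(words) != len(source):
        return False
    return all(w[0].lower() == c.lower() for w, c in zip(words, source))
```

As the thread notes, "R from acronym" misses titles containing filler words ("and", "of"), which is exactly the intended trade-off against false positives.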
Personally I don't think redirects should be categorized on simple at all, regardless of whether it's by a bot or a human, just doesn't seem the simple way to me. If it is done though, I guess it won't really matter which route is done. Computer Fizz (talk) 08:47, 23 January 2020 (UTC)[reply]
Since it’s mostly behind the scenes, more like maintenance categories and not related to page content, I don’t see why the practices should differ much between enwiki and simple (except for some regrouping/renaming; the categorization system here is a lot less technical than enwiki). For the parts where it does affect readers (such as if they happen to stumble across the redirect pages themselves, which will definitely happen sometimes) it helps with readability and navigation to have what’s going on be explained instead of them only being presented with the technical wiki markup for a redirect. Brantmeierz (talk) 16:14, 23 January 2020 (UTC)[reply]
@Computer Fizz: From a management side, ignoring the user perspective, it’s nice to have the pages bound to something, since in a lot of cases the only links to them will be found on the “What links here” section of any given page. Having them categorized with similar pages helps make them more searchable than if they were floating around without any page links, and is especially helpful for restructuring or renaming things since they can be identified as a batch with similar properties. Brantmeierz (talk) 16:18, 23 January 2020 (UTC)[reply]
Yeah our category system is intended to be a lot more simple than the one on en.wiki. Having a complex category system can end up being unsimple. I do understand where you are coming from so you aren't out in left field or anything. Personally I don't care one way or the other if we categorize redirects. I just know I am not likely to do it myself because I doubt anyone would ever really look at the categories of redirects. -DJSasso (talk) 16:33, 23 January 2020 (UTC)[reply]
  • Approved for trial (100 edits). Ok I will approve a trial for only the 4 tasks above and Rcat shell. I would like to see 100 edits with some edits of each kind. Should you want to add other tasks in the future you would need to come back here for additional approval. -DJSasso (talk) 16:33, 23 January 2020 (UTC)[reply]
Isn't there an abuse filter called "redirect page with extra text"? If we're categorizing them now, should that be deleted? Computer Fizz (talk) 17:07, 23 January 2020 (UTC)[reply]
Filter 36, but it can easily be changed to include the bot group as well. rollingbarrels (talk) 17:10, 23 January 2020 (UTC)[reply]
We have been categorizing them for a long time. Just no one has wanted to do it to the scale he is suggesting. It isn't new. But yes, I will likely add the bot group. (although probably not needed because the bot would be autoconfirmed) -DJSasso (talk) 17:11, 23 January 2020 (UTC)[reply]
Abuse filter 36 was designed to tag edits where an unsuspecting user adds content to a redirect, without realising that it would not be visible. If you were to look at the filter rules you will see that all autoconfirmed users will not trip the filter. And as for non-autoconfirmed users causing a false positive if they attempt to categorise redirects, I think the chance of that happening is very low, and it is far more likely that non-autoconfirmed users are (mistakenly) trying to add content in a redirect page. Chenzw  Talk  13:36, 24 January 2020 (UTC)[reply]
Yep that is why I said not needed because it would be autoconfirmed. And you are right very unlikely a non-autoconfirmed user would be categorizing articles legitimately. And it only tags an article, doesn't prevent it anyway. -DJSasso (talk) 13:41, 24 January 2020 (UTC)[reply]
┌───┘
@Brantmeierz: Has not been active since May, so should this trial be closed as inactive? rollingbarrels (talk) 17:32, 14 December 2020 (UTC)[reply]

User:DYKUpdater[change source]

  • Contributions
  • Operator: FNAFPUPPETMASTER
  • Programming language: Python (Pywikibot)
  • Function: Simplifying/automating the DYK panels on the main page, archiving recent ones, and clearing the replaced ones.
  • Description: This bot would make the DYK updating process simpler by running every week when the panel needs to be replaced — Preceding unsigned comment added by FNAFPUPPETMASTER (talkcontribs)
Two issues here: 1) have you already written the bot code for this task? 2) there is currently no editor activity on DYK. Until activity picks up, a bot is effectively useless. Chenzw  Talk  07:17, 27 February 2020 (UTC)[reply]
Not only that but the DYK system is automatic on its own already with a queue system. -DJSasso (talk) 11:38, 27 February 2020 (UTC)[reply]
Then perhaps just moving this into a user script? rollingbarrels (talk) 15:43, 27 February 2020 (UTC)[reply]
Withdrawn by operator. rollingbarrels (talk) 17:12, 14 December 2020 (UTC)[reply]

Bot Request[change source]

Bot for blocking open proxies (APIs and examples can be provided, but I am not allowed to run such a bot, that would require admin flags). Naleksuh (talk) 03:25, 24 July 2020 (UTC)[reply]

This would need to be evaluated very carefully, and would require broader community approval due to automated admin actions. How would this bot work, and what is the codebase? Chenzw  Talk  10:21, 24 July 2020 (UTC)[reply]
Finding IPs that edit via EventSource and running them through the same checks the Toolforge tool uses (i.e. https://ipcheck.toolforge.org/index.php?ip=185.220.101.27). There currently isn't a "codebase" because 1) I am not allowed to operate bots that block without RfA'ing, and 2) this is "request a new bot", not "create a new bot". That said, I do have experience with similar bots in the past (Special:PermanentLink/7044544#Second_Flood_Request), and if this task is approved it can be created by me (likely in Node). Naleksuh (talk) 19:58, 24 July 2020 (UTC)[reply]
So what you'd like is a new bot function rather than a whole new bot? If an existing bot could perform the function, would that suffice? --Auntof6 (talk) 22:27, 24 July 2020 (UTC)[reply]
@Auntof6: I'm not really sure what the difference would be, as no bots currently do anything relating to proxies. If you mean whether there needs to be a whole new account: not necessarily; lots of our bots do multiple tasks.
I have been asked by multiple people for code so far (chenzw asked what the "codebase" was right above, oper asked me on irc to write code for this bot that he can review). I have been under the impression that I would not be allowed to run a bot that blocks, but since multiple admins have asked me to create it, that may not be the case. Let me know. I am willing to help any admins interested in creating such a bot (or "bot function" as Auntof6 says). Naleksuh (talk) 23:28, 24 July 2020 (UTC)[reply]
I am not sure what the confusion is. For the benefit of the wider community, in my previous conversation with Naleksuh on Discord, I stated that since examples could be provided (from above), the community would probably want to see actual bot code before approving this particular use case for a bot. Chenzw  Talk  02:10, 25 July 2020 (UTC)[reply]
Here is the description that I gave to @Chenzw: off-wiki: "essentially, whenever an IP edits, it is checked through several proxy APIs, and if it is determined to be a proxy, it is blocked." Now that we are on-wiki, some questions should be answered:
1. How long will the block be? (My opinion: 1 month, this is the standard length and if it still reports to be a proxy at the end it can just block for another month)
2. Should it revert the IP's edits, or just block? (My opinion: just block; the use of a proxy alone does not guarantee an LTA (other, behavioral evidence is required that a bot can't find), and reverting doesn't require an admin, only blocking does)
3. Should it post a block notice on the IP's talk page? (My opinion: no; this is mainly intended for an LTA who frequently visits old talk pages)
Other than this, decisions are probably left up to the person operating the bot. A 'quick example' can be found at https://gist.github.com/Naleksuh/914ab4ac3b0277e44c29c862383974e5 - although this only includes one service and does not block the associated proxies (that is in progress). Hope this clears things up! Naleksuh (talk) 03:48, 25 July 2020 (UTC)[reply]
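The policy proposed in points 1-3 above could be sketched as follows. All names here are hypothetical (the bot was never built); a real implementation would first query the proxy-check services and feed their verdicts in.

```python
# Hypothetical policy settings, mirroring points 1-3 above.
BLOCK_DURATION = "1 month"   # point 1: standard length, renewable if still a proxy
REVERT_EDITS = False         # point 2: block only, don't revert
POST_TALK_NOTICE = False     # point 3: no talk page notice


def should_block(api_verdicts, threshold=1):
    """Block when at least `threshold` proxy-check services flag the IP."""
    return sum(1 for verdict in api_verdicts if verdict) >= threshold
```

A higher `threshold` would trade missed proxies for fewer wrongful blocks, which matters more here than for an ordinary bot since the blocks are admin actions.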
If we were going to do this (and I am not entirely sure we should get into having admin bots on simple), we should just see if we can't get a copy of the one operating on en.wiki to operate here, if that botop is comfortable giving it to one of the admins here who is experienced in running bots. (They may not be, because of the special sauce used to find said proxies and wanting to keep it out of the wrong hands.) -Djsasso (talk) 22:45, 28 July 2020 (UTC)[reply]
@Djsasso: Would you be referring to User:ProcseeBot or User:ST47ProxyBot? Also, both of these bots do not use rc-relayed information and they both miss tons of proxies (I would say about 90%+ of the editing proxies), so using them here would be very ineffective. Naleksuh (talk) 06:13, 31 July 2020 (UTC)[reply]

I would like to create a new bot[change source]

Hello, I’m Astronomyscientist124 and I am here to request a new bot on the Simple English Wikipedia. I would like it to be called AstronomyBot, and its tasks are to block vandals, create articles on science, and make users administrators. I would like to operate the bot if that is okay, thanks. Astronomyscientist124 (talk) 05:09, 12 October 2020 (UTC)[reply]

I am not a crat, but none of those are tasks we would want a bot to do. Blocking users and making users admins requires admin and crat rights, respectively, and those rights are not given to bots. We also don't want bots to create articles. --Auntof6 (talk) 05:56, 12 October 2020 (UTC)[reply]
Bots are there to do routine tasks that can easily be automated. So far, we have a bot to report vandals. We generally frown on automatic article creation (articles also need to be maintained). Making someone an administrator (or giving out other "advanced rights") is something that needs a community discussion, and therefore cannot be done by a bot. --Eptalon (talk) 02:24, 14 December 2020 (UTC)[reply]

EnWikiCopyBot[change source]

  • Contributions
  • Operator: Fnafpuppetmaster
  • Programming language: JavaScript / Node.js
  • Function: Copyright
  • Description: This would monitor any new pages created to see if they have been copy-pasted from enwiki, mark them as QD A3, and give their creators tips on how to translate and attribute from enwiki via a talk page message. While I am aware that Filter 14 exists, the page would not need to be manually looked at, as the bot only acts when the page is a direct copy-paste.

--rollingbarrels (talk) 02:10, 14 December 2020 (UTC)[reply]
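The "direct copy-paste" check the request describes could be sketched as a whitespace-insensitive text comparison. This is an assumption about how such a bot might work, not actual bot code; a real check would also need to fetch the enwiki revision via the API.

```python
import re


def normalize(wikitext):
    # Collapse all whitespace so trivial formatting differences
    # don't hide an otherwise verbatim copy-paste.
    return re.sub(r"\s+", " ", wikitext).strip()


def is_direct_copy(new_page_text, enwiki_text):
    return normalize(new_page_text) == normalize(enwiki_text)
```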

Denied. Copy-pastes are not automatically QD A3s. Copy-pastes are perfectly acceptable if the language is not complex and the page is attributed. -Djsasso (talk) 20:09, 14 December 2020 (UTC)[reply]

JarBot[change source]

  • Contributions
  • Operator: جار الله
  • Programming language: Python
  • Function: Create articles, Categories, Templates etc.
  • Description: I would like to create stub articles based on Wikidata, and to create and add categories and templates to articles based on enwiki. It is going to be a long road, but with help from the community we will reach the goal. Example from arwiki by the bot.

--جار الله (talk) 17:54, 22 November 2020 (UTC)[reply]

I know you are a very trusted user elsewhere and JarBot is quite useful. However, if its pages are like Gylfi Einarsson, which you just created, I will say no: there are 12 red categories and an unused template here, which isn't helpful. One-liner pages cannot help readers much. If the pages can have more information and better quality, surely this can be an idea. Note that the community typically doesn't like bot editing; see Wikipedia:Simple_talk/Archive_132#Gay_Yong_Hernandez_and_systematic_creation_of_stubs_from_IPs. Thanks for your help anyway. Camouflaged Mirage (talk) 18:11, 22 November 2020 (UTC)[reply]
Hello @Camouflaged Mirage: we can work on quality and information. As I said, the bot will create the categories and templates, and I see that the infobox is not linked to Wikidata; I will work on it if I get the approval. There is a lot of work to do, and you can give me notes about required information and I will see if I can add it to the pages. Also, the articles are not only about humans; there will be articles about cities, and the information will be of varying quality. regards--جار الله (talk) 18:34, 22 November 2020 (UTC)[reply]
@جار الله Here we typically don't create too many categories; we keep them to a minimum, and only create a category when there are 3 pages for it. As for templates, we try to create them only when there are a lot of blue links; a template of all red links is quite useless. I can't decide this - it needs a crat, I am just commenting. Please also read through the links in the welcome message I left on your talk page. Best, Camouflaged Mirage (talk) 18:51, 22 November 2020 (UTC)[reply]
Denied. Creating articles by bot on this wiki has been banned by community discussion in the past. It caused us to be flooded with tiny one-sentence stubs that were created just from data tables, were no help to anyone, and caused a lot of cleanup work deleting many thousands of them. -Djsasso (talk) 16:45, 23 November 2020 (UTC)[reply]
@Djsasso: Okay, what about creating and adding categories if there are three or more pages that need the category? Also adding categories to articles based on enwiki, if the category exists here.--جار الله (talk) 17:24, 23 November 2020 (UTC)[reply]
@Djsasso: Do I need to apply new request?--جار الله (talk) 05:27, 16 December 2020 (UTC)[reply]
@جار الله: Non-crat comments: How would a bot determine what categories needed to be created? Having determined that, how would a bot know how to set up the category? We wouldn't want categories created without being properly categorized and formatted themselves. As for adding categories based on enwiki, we don't have the same category structure here, so you couldn't add cats just because they're on enwiki. In short, you'd need to give more detail about how this would work. Finally, I suspect the crats would be hesitant to approve a bot for a user who has only made 7 edits on this wiki. --Auntof6 (talk) 05:55, 16 December 2020 (UTC)[reply]
How would you determine that? If you look at the category Category:Cities in Iowa, there are currently over 900 pages in the category. Most of the pages are 2-3 sentences long, and were possibly created automatically. If I look at en:List of largest Iowa cities by population, the top 5 are Des Moines, Iowa (215,000, state capital), Cedar Rapids, Iowa (126,000), Davenport, Iowa (101,000), Sioux City, Iowa (83,000) and Iowa City, Iowa (75,000). Note that Iowa is just an example; the situation looks similar for many U.S. states. Also note that Davenport is a good article on EnWp, but all we have is a 2-3 sentence stub. If I write a program that magically re-classifies Davenport, and perhaps adds an infobox if it isn't there yet, this will not solve the fundamental problem: 35 total edits since 2009, about 3-4 a year. 65% of the edits by bots. 24 page views in the last 60 days. (full statistics) - Remember, we are talking about the third-largest city in Iowa. And no, Iowa is just a random example, I might also have picked any other U.S. state. Breaking up into counties will leave us with an endless number of counties, but will not solve the problem of the article. --Eptalon (talk) 11:41, 16 December 2020 (UTC)[reply]

@Eptalon and Auntof6: I understand the "BotDenied" of creating articles; now I ask for approval to create and add categories. Here are more details:

Thanks.--جار الله (talk) 23:13, 16 December 2020 (UTC)[reply]

We try not to create as many categories as en.wiki has. This is one of the areas where we are simple. So I don't think a bot would be a good fit for this, because the bot can't make judgement calls on whether a particular category should be split up into sub-categories. I would end up denying this request as well, especially for a user who has no editing history here and doesn't understand the differences between here and en.wiki, which would be very necessary for a task like this. -Djsasso (talk) 14:31, 18 December 2020 (UTC)[reply]
@Djsasso: I guess just adding the categories wouldn't be a problem. However, if you would deny even the adding part, you may deny this request. Greetings.--جار الله (talk) 20:18, 18 December 2020 (UTC)[reply]

MajavahBot[change source]

  • Contributions
  • Operator: Majavah
  • Programming language: Python (Pywikibot), SQL
  • Function: Create redirects from "NAME (film)" to "NAME (movie)"
  • Description: Creating these kinds of redirects would allow just changing en. to simple. in the URL when "jumping" from enwiki to here. Naleksuh previously had a script to do this, but he hasn't run it recently and the number of missing redirects has already gone over 50. I've written my own script to do exactly the same.

--Majavah (talk) 14:38, 11 October 2020 (UTC)[reply]
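The title transformation itself is mechanical. A minimal sketch (illustrative only, not Majavah's or Naleksuh's actual script):

```python
def movie_redirect_title(enwiki_title):
    """Return the "(movie)" counterpart of an enwiki "(film)" title, or None."""
    suffix = " (film)"
    if enwiki_title.endswith(suffix):
        return enwiki_title[:-len(suffix)] + " (movie)"
    return None  # title is not disambiguated as a film


def redirect_wikitext(target):
    return "#REDIRECT [[%s]]" % target
```

The harder part, as the discussion notes, is finding which "(film)" titles lack a local "(movie)" counterpart, which is presumably where the SQL comes in.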

Greetings. I have been away from home for a while now. However, I have since returned and will create film->movie redirects now. No prejudice against discussing new owners or tasks once it is done. Naleksuh (talk) 19:44, 13 October 2020 (UTC)[reply]
I am not sure we even need a bot to do this. As was mentioned in the original discussion for this, it is something that should probably be done by hand as needed. In the first instance we allowed the script because there were so many, and it just made things faster, but I am not sure we would want it to continually happen with a bot account. -Djsasso (talk) 17:44, 14 October 2020 (UTC)[reply]
Well while Naleksuh was inactive, there were over 50 that needed doing. It is clear this is not being done manually by anyone and will simply never be created without some sort of script. That is worth noting. --IWI (talk) 10:57, 15 October 2020 (UTC)[reply]
To me, this is a "nice-to-have", not something we really need, so saying that they "needed doing" is overstating things. --Auntof6 (talk) 11:01, 15 October 2020 (UTC)[reply]
True. It is not exactly a very important task that needs doing. --IWI (talk) 11:03, 15 October 2020 (UTC)[reply]
Yeah, Auntof6 states what I was trying to say better. Creating these redirects is a nice thing, but not at all a necessary thing. No more so than many other tasks we have. i.e. We have articles of all types that have different names than their English counterparts. The languages sidebar makes switching between the en and simple versions quick and easy. While I do often just switch the URL, there really is a solution for this already on the side of the page. -Djsasso (talk) 16:14, 15 October 2020 (UTC)[reply]
It is difficult to do by hand; on a wiki with 175k articles you would need an automated process to find which titles to create. And at that point, you might as well create them automatically as well. That said, what we currently have is semi-automated, and I don't see a clear need to make it fully automated. Naleksuh (talk) 01:12, 18 October 2020 (UTC)[reply]