
Community Wishlist Survey 2017/Bots and gadgets

Bots and gadgets
10 proposals, 231 contributors



Deploy Article Alerts to other languages

  • Problem: Article Alerts is an automated, subscription-based news delivery system designed to notify WikiProjects and Taskforces when articles tagged by their banners or placed in their categories enter various formal workflows (such as Articles for Deletion, Requests for Comment, Peer Review, and many more); a rough sketch of this detection idea follows this proposal's list. See WikiProject Physics/Article alerts for an example. The system is currently exclusive to the English Wikipedia and is maintained by one user (en:User:Hellknowz), meaning the bus factor leaves the whole project in a fragile state.
  • Who would benefit: Every edition of Wikipedia. For scale, on the English Wikipedia, 1,467 WikiProjects and Taskforces are subscribed to the Article Alerts system. Virtually every active project is subscribed, and the system is one of the best lines of defense against improper deletions and one of the best ways to advertise ongoing high-level discussions to communities of interest.
  • Proposed solution: Have the WMF / larger Wikimedia community create a more solid and scalable framework for Article Alerts that can also be deployed to all languages. Maybe it's not possible to have something as customizable as the English Wikipedia's implementation, but there are loads of things that could be ported and deployed. (Edit: See en:Wikipedia:Article alerts/Roadmap for a tentative roadmap.)
  • More comments:
  • Phabricator tickets:
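
The detection step described in the problem statement can be approximated in a few lines. This is only a rough sketch (not the actual Article Alerts bot), assuming a WikiProject banner as the tagging mechanism and using pywikibot to find project-tagged articles currently in the Articles for Deletion workflow:

```python
# Rough sketch, not the real Article Alerts bot: find articles tagged by a
# WikiProject banner that are currently listed at Articles for Deletion.
# The banner below is just an example.
import pywikibot

site = pywikibot.Site('en', 'wikipedia')
afd_category = pywikibot.Category(site, 'Category:Articles for deletion')
banner_title = 'Template:WikiProject Physics'  # example project banner

for article in afd_category.articles():
    if article.namespace() != 0:
        continue
    talk = article.toggleTalkPage()
    if not talk.exists():
        continue
    # An article counts as "tagged by the project" if its talk page
    # transcludes the project's banner template.
    if any(t.title() == banner_title for t in talk.templates()):
        print(f'{article.title()} is at AfD and is tagged by the project')
```

A real implementation would also need to watch the other workflows (RfC, Peer Review, etc.) and write the results to the subscribing project's alerts page.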

Discussion

  • I rank this as one of the most useful watchlist tools on en.wikipedia, and the bus factor worries me, although the code could presumably be shared even as is. However, the main barrier to overall global implementation is probably on places like simple.wikipedia, where there are no WikiProjects and the categories are generally incomplete. Therefore it will probably be of more use on Wikipedia language editions with a larger userbase, like de.wikipedia. A Den Jentyl Ettien Avel Dysklyver (talk) 15:08, 9 November 2017 (UTC)[reply]
To clarify, there is redundancy in code access (two people have it); the problem is that there's only one coder. This makes new features, bug fixes, and general maintenance dependent on the will, time, and capabilities of one person. Running the bot on Toolforge would also be great and would make it more reliable during holidays, or if there's a power failure and such. The bot is clearly more useful for the larger Wikipedias (certainly all 10 Wikipedias featured on the main landing page would greatly benefit from this), but if a framework can be designed and ported, it wouldn't be a lot of work for dedicated subcommunities on smaller wikis to benefit from this as well, even if it's easier to follow every discussion on a smaller Wikipedia. Headbomb (talk) 18:33, 9 November 2017 (UTC)[reply]
Not even all larger Wikipedias use WikiProjects, though. It's not only a size thing, but also history (how have we organized) and editing culture. It's probably good for editors to be keenly aware of the fact that if they don't have WikiProjects, the infrastructure for something like this isn't there on their home wiki, and it won't work for them. /Julle (talk) 12:29, 4 December 2017 (UTC)[reply]
@Julle: To a point. You don't have to be organized by Wikiproject, you can be organized by categories for instance, or organize via any other grouping of topics. For instance, you could have alerts based on the transclusion of an infobox, such as en:Template:Infobox biography. So even if you don't have a Wikiproject structure in your own wiki, there are still ways this could be used. That's part of what the proposal is. Have the WMF take over, and have a team that's dedicated to making it work for the specific needs of other (non-en:wiki) Wikipedias. Headbomb (talk) 20:20, 4 December 2017 (UTC)[reply]
  • I love Article Alerts! It is a great tool that can be used for informing large groups of editors about issues that interest them without annoying others. It is also wonderful for those interested in collecting wiki-statistics. Wikiprojects that have been set up to look after a group of articles can add Article Alerts to the wikiproject with minimal effort, and have the Article Alerts available to anyone who knows where to look for them. There was a good Signpost article about Article Alerts back in 2009.
The en.wiki has thousands of Wikiprojects (the Signpost has many articles about individual wikiprojects). Wikiprojects are a great way for editors to find other editors interested in a certain topic, which can be a great motivator for many. Tools such as Article Alerts help shift the burden of running wikiprojects from human editors to bots. This in turn helps retain editors, since much of this task is repetitive/tedious and tends to burn out humans.
I am shocked to hear that Article Alerts depends on one individual editor. I hope the Wikimedia Foundation has taken out an enormous life insurance policy on this individual. Ottawahitech (talk) 17:17, 12 November 2017 (UTC) Please ping me[reply]
  • I would love for the WMF to make this a wiki-compatible, configurable tool (at least for the English WP for a start) rather than an external bot. The benefits are too numerous for me to list, but multiple languages is certainly one of them. Converting it into an extension or Tool Labs bot or something of the sort would be much too big a project for me to do. It's been quite a few years since I coded AAB, and real life certainly means that one volunteer coder for the project is unfeasible for any serious expansion into other languages or complex workflows and on-wiki features. The code is also pretty atrocious and team-unfriendly now that I can look at it with some 10 years of experience. It was more of a "wow, this sounds like a cool bot I can code" than a plan for a community-driven open source tool. The things that would be different/new/incompatible for a tool that works directly with the database and/or MediaWiki are again too numerous to list. I can fix and keep the bot running when issues occur, but getting to new features and even bugs is a hurdle that could be abruptly terminated by a bus. —  HELLKNOWZ  ▎TALK  ▎enWiki 23:26, 1 December 2017 (UTC)[reply]
  • Don’t underestimate yourself, User:Hellknowz (and you too, User:Headbomb). The code you wrote ten years ago may not be up to your standards of today, but it is robust and has served Wikipedia’s community well. I worry that the code the WMF staff replaces it with will be inferior in terms of features.
Also, I wanted to add a comment about statistics, which are not harvested and put to good use. For example, alerts of AfD discussions show the number of participants in each deletion discussion when they are displayed on the project(s) page(s). However, no one is capturing this information for statistical purposes as far as I know. Wouldn’t it be nice to know how many articles are deleted overall based on the votes/consensus of one or two people? Ottawahitech (talk) 19:39, 2 December 2017 (UTC) Please ping me [reply]
@Ottawahitech: The bot has served the community well, we're well aware of that. I gave a retrospective at Wikimania this summer, where WMF devs expressed interest in taking over [assuming it's endorsed in the community survey]. To be frank, the code's status has gotten to a point where whatever new feature we want to have, we can't have, either because it would blow up the whole bot because of en:spaghetti code, or because it would take too much of Hellknowz's time to implement. Very little has been done on en:WP:AALERTS in over two years. The WMF taking over would mean much better integration (e.g. possibly even integrating this with en:WP:Notifications), much better and more reliable performance [we need manual restarts on our personal machines if the bot crashes; if we're busy, that can take a few days before we get to it], and it would allow other languages to benefit from it. It might have fewer features at first, but over time they would likely do more, and do it better. Headbomb (talk) 19:51, 2 December 2017 (UTC)[reply]

Voting


Wikidata reference "stated in" from url bot

  • Problem: Many Wikidata items with reference URLs do not have a "stated in" property. This makes it hard to glance at an item's references and judge the source and its potential trustworthiness from a name. It also makes it harder to run queries that could generate lists of potentially notable items based on their Wikidata references, e.g. to create a worklist for edit-a-thon organizers using the query engine.
  • Who would benefit: community and people running workshops
  • Proposed solution: Create a bot that, when it sees a reference URL from P184 such as www.nytimes.com or huffingtonpost.com, adds a "stated in" value of "New York Times" or "Huffington Post" (a minimal sketch of the domain-mapping logic follows this list).
  • More comments: Beyond this, general improvement of the referencing sections of Wikidata items would help build credibility by encouraging more people to add references.
  • Phabricator tickets:
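
A minimal sketch of the domain-to-item mapping such a bot would need, assuming a hand-maintained lookup table. It shows only the decision logic and deliberately stops short of writing P248 ("stated in") claims to Wikidata; the item IDs are illustrative placeholders, not verified values:

```python
# Sketch of the mapping step only: given a reference URL, propose a
# "stated in" (P248) target. Item IDs are placeholders; verify them on
# Wikidata before doing real edits.
from urllib.parse import urlparse

DOMAIN_TO_ITEM = {
    'nytimes.com': 'Q9684',          # assumed item for The New York Times
    'huffingtonpost.com': 'Q56786',  # placeholder, not a verified item ID
}

def stated_in_for(reference_url):
    """Return the proposed P248 item for a reference URL, or None."""
    host = urlparse(reference_url).netloc.lower()
    if host.startswith('www.'):
        host = host[4:]
    return DOMAIN_TO_ITEM.get(host)

print(stated_in_for('https://www.nytimes.com/2017/11/09/example.html'))
```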

Discussion

I don't know enough about bot development to do it. --LauraHale (talk) 10:49, 9 November 2017 (UTC)[reply]
I don't think we have to worry about the bot being too complicated, if it gets approved someone can probably do it. A Den Jentyl Ettien Avel Dysklyver (talk) 14:56, 9 November 2017 (UTC)[reply]
This should probably be tackled on Wikidata. Sourcing on Wikidata is currently done in different ways and isn't very consistent.
  1. Get agreement on the best and consistent way to source things.
  2. Inform bot operators of the consensus so they can update their code.
  3. Run bot jobs to make existing sources more consistent.
  4. Set up regular reports and bot jobs to improve sources.
I think the first step is actually the most difficult one. Feel free to copy this somewhere on Wikidata; I haven't really gotten around to starting this discussion. Multichill (talk) 16:51, 17 November 2017 (UTC)[reply]

P184 is doctoral advisor, you probably meant something else. --Tgr (WMF) (talk) 05:55, 18 November 2017 (UTC)[reply]

Maybe the property P248 "stated in". --X:: black ::X (talk) 14:08, 5 December 2017 (UTC)[reply]

Voting


Make adding "Wikipedia in the News" easier

  • Problem: Wikipedia in the News is currently maintained in two locations: the WP:Press coverage master list, and article talk pages with Template:Press. It's all done manually, with high overhead; adding stories is enough of a pain that coverage is haphazard.
  • Who would benefit: all editors and journalists
  • Proposed solution: A web-based form for adding 'In the News' events that automatically adds entries to the appropriate places (talk pages and master lists); a rough sketch of that distribution step follows this list.
  • More comments: Data could be stored in Wikidata, and the templates modified to draw from Wikidata. This would allow a truly global 'In the News'.
  • Phabricator tickets:
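
As a rough illustration of the distribution step (not an existing tool), a bot that reads a central list of press entries and adds {{Press}} to the article talk pages could look something like the sketch below. The entry format and the template parameter names are assumptions and should be checked against Template:Press's documentation:

```python
# Rough sketch: distribute press entries from a central list to article
# talk pages via {{Press}}. The entry fields and template parameters are
# assumptions, not the template's documented interface.
import pywikibot

site = pywikibot.Site('en', 'wikipedia')

entries = [
    {'article': 'Example', 'title': 'Example in the news',
     'org': 'Example Times', 'url': 'https://example.org/story',
     'date': '2017-11-09'},
]

for entry in entries:
    talk = pywikibot.Page(site, 'Talk:' + entry['article'])
    press = ('{{Press |title=%(title)s |org=%(org)s '
             '|url=%(url)s |date=%(date)s }}' % entry)
    if entry['url'] in talk.text:
        continue  # this story is already listed on the talk page
    talk.text = press + '\n' + talk.text
    talk.save(summary='Adding press mention to talk page (sketch)')
```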

Discussion

Initially that seems like a good idea, but with the vision that, if the external links were stored in Wikidata, there's no reason it couldn't become a global database of Wikipedia in the News, regardless of country or language. -- GreenC (talk) 21:02, 18 November 2017 (UTC)[reply]
  • Clarification The votes are trending negative and I don't understand the reasons given (or "reason" since they all cite the same reason). To clarify, a tool (website) is used to enter the "In the News" data into Wikidata, where a bot then automatically distributes the data. For example on Enwiki, it would automatically create the WP:Press coverage master list, and article talk pages with Template:Press. Other project languages might have different methods, but same scenario, a bot would automatically create whatever pages or templates they use, drawing from the centrally located "In the News" database which anyone from any language can update. -- GreenC (talk) 22:14, 2 December 2017 (UTC)[reply]
    • I don’t know any reason to store this data in Wikidata. It can be stored on the page where it’s currently stored (w:WP:PRESS), and a bot can distribute it without any fancy tool. This bot can be written by an enwiki bot owner for enwiki. I, as a Hungarian speaker, am not interested in Catalan-language news about Wikipedia, it makes no sense to maintain it internationally. –Tacsipacsi (talk) 09:32, 3 December 2017 (UTC)[reply]

Voting


Combined Desktop/Mobile/App-View for the Pageviews Analysis

  • Problem:

One of the most common questions when it comes to pageviews analysis is how to compare the percentages of pageviews from the different platforms (mobile, desktop, app). With the current tool this is quite complicated, since the tool cannot display more than one platform at a time, so a direct graphical comparison is impossible. (A sketch of fetching per-platform counts from the Pageviews API follows this proposal.)

  • Who would benefit:

All readers and editors interested in pageview statistics

  • Proposed solution:
  1. Create an additional view where the bars for all the different platforms are stacked in one diagram.
  2. Or convert the "platform" selection menu from a single-select dropdown to a multi-select control (e.g. checkboxes), so that it is possible to configure a customized diagram.
  • More comments:
  1. I agree with the problem and the benefits of a solution. I'm just not sure that the proposed solution is the optimal one. I'm a newbie and I have a newbie perspective on everything. FWIW, I have some experience in 'Application Portfolio Management' which undoubtedly influences my views. From my perspective, I feel that 'the movement' would be better served (now and in the future) by moving towards a much more consolidated, integrated set of apps (functionally, technically and in terms of user experience) rather than by maintaining/improving/expanding the current large collection of disparate individual gadgets/tools. The current gadgets and tools are difficult to find and use for newbies and are limited in scope and functionality. Today (after hours of searching) I discovered the Wikistats 2.0 Design project. The project currently has a prototype up and running for evaluation and will be released soon. It's the first version and improvements/extensions will surely follow. It seems to me that any solutions to limitations on pageviews/platforms would be better integrated with Wikistats 2.0 or follow-up versions. The benefit would be that Wikimedians have just one app for analytics on any selection of pages/projects/countries/languages/audiences (editor/reader) and devices. I'm used to working with Google Analytics and I hope that something similar becomes available through Meta-Wiki. Just my opinion. If this proposed solution is a quick/cheap fix, then it may be a worthwhile short-term solution. Otherwise, I think the effort would be better spent on a solution consolidated/integrated with other improvements to Analytics.

Mike Morrell49 (talk) 23:18, 29 November 2017 (UTC)[reply]

  • Phabricator tickets:
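
For what it's worth, the per-platform numbers are already exposed by the Wikimedia Pageviews REST API, so the missing piece is mostly the combined display. A minimal sketch of pulling the three access types for one article (the article and date range are arbitrary examples):

```python
# Minimal sketch: fetch per-platform pageview totals for one article from
# the Wikimedia Pageviews REST API. Article and date range are examples.
import requests

API = ('https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/'
       '{project}/{access}/user/{article}/daily/{start}/{end}')
HEADERS = {'User-Agent': 'pageviews-comparison-sketch/0.1 (example contact)'}

def total_views(project, article, access, start, end):
    url = API.format(project=project, access=access, article=article,
                     start=start, end=end)
    items = requests.get(url, headers=HEADERS).json().get('items', [])
    return sum(item['views'] for item in items)

for access in ('desktop', 'mobile-web', 'mobile-app'):
    views = total_views('en.wikipedia', 'Wikipedia', access,
                        '2017110100', '2017113000')
    print(access, views)
```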

Discussion


User:West.andrew.g does this for the top 5000 medical articles on a weekly basis.[1] And for the overall top 5000 articles.[2] Doc James (talk · contribs · email) 01:54, 5 December 2017 (UTC)[reply]

Voting


Make Flow database accessible on Tools Labs

  • Problem: Flow is a new extension that completely changes the user experience of discussions, as well as their technical underpinnings. One of the important problems with the use of this extension on Wikimedia wikis is that tools, bots and other programs hosted on the Tool Labs servers cannot easily access this data. For example, the web services that provide lots of statistics about users cannot deal with Flow edits, which are not taken into account.
  • Who would benefit: Users of the many applications hosted on Tool Labs; many bots and other tools could benefit indirectly from it.
  • Proposed solution: Make the Flow database available/accessible on Labs/Tools, so developers can access this data and show it to readers (a rough sketch of the kind of query a tool might run follows this list).
  • More comments:
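
To illustrate what tool authors are asking for, here is a hypothetical sketch of the kind of query a Tool Labs tool might run if the Flow tables were replicated. The table and column names are assumptions based on the Flow extension's schema; as the proposal says, this does not work on the replicas today:

```python
# Hypothetical sketch: what a Tool Labs tool might do IF Flow tables were
# replicated. Table and column names are assumptions drawn from the Flow
# extension schema; they are not currently available on the replicas,
# which is exactly the problem this proposal describes.
import toolforge  # helper library for connecting to the wiki replicas

conn = toolforge.connect('enwiki')  # enwiki replica connection
with conn.cursor() as cur:
    # Count Flow revisions per user (assumed schema).
    cur.execute("""
        SELECT rev_user_id, COUNT(*) AS flow_edits
        FROM flow_revision
        GROUP BY rev_user_id
        ORDER BY flow_edits DESC
        LIMIT 10
    """)
    for user_id, flow_edits in cur.fetchall():
        print(user_id, flow_edits)
```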

Discussion


Voting


Turn UTCLiveClock into an extension

  • Problem: The UTCLiveClock is one of the most used gadgets across the projects (typically within the top 5). There are 3 problems with it:
    1. It loads after the rest of the page loads, which causes the other links in the user toolbar to shift, often leading to people accidentally clicking the "Log out" link.
    2. Some projects don't have the gadget.
    3. Because it's a gadget rather than a preference, it won't be available as a global preference.
    Turning it into an extension will solve all three of these problems.
  • Who would benefit: All current and future users of UTCLiveClock
  • Proposed solution: Turn it into an extension with its own preference. Migrate people who are using the existing extension.
  • Phabricator tickets:

Discussion

  • I added a similar idea to mw:Extension:Purge the other day: https://github.com/Hutchy68/Purge/issues/16 Perhaps the UTCclock and MediaWiki:Gadget-purgetab.js could be combined into one extension? Sam Wilson 00:30, 9 November 2017 (UTC)[reply]
  • I have all sorts of things up there in my top bar, which looks like: A Den Jentyl Ettien Avel Dysklyver - Alerts (0) - Notices (0) - Talk - Sandbox - AfDs Closing - Page Curation - AfDs Today - AfDs All - Preferences - Beta - Watchlist - Contributions - Log out - 15:10:00, and they all load at different times over a few seconds; anything to get them to load at the same time would be useful. A Den Jentyl Ettien Avel Dysklyver (talk) 15:13, 9 November 2017 (UTC)[reply]
  • It's a bit hacky, but you can use peer gadgets to reserve the space with CSS before the JavaScript loads, so that things don't jump around. This is already being done with the UTCLiveClock gadget on English Wikipedia and mediawiki.org. If you wanted to source the gadget as a user script (in global.js, for instance), you'd have to add the necessary CSS as well (e.g. see the bottom of User:MusikAnimal/global.css). But an extension is still better! I think it makes just as much sense to include it as part of Purge, too — MusikAnimal talk 20:17, 9 November 2017 (UTC)[reply]
  • I understood that Performance team were working to remove the purge action entirely from MediaWiki. This seems to go against that work? Jdforrester (WMF) (talk) 02:16, 15 November 2017 (UTC)[reply]
    That's being tracked in phab:T56902, and seems like it'd be a reasonably complex and lengthy project (that ticket's been open for four years). On the other hand, cleaning up these gadgets would be a pretty easy thing to do (and most of the work has already been done). So I think it's still worth it, even if it's only used for a year or two. Maybe. Happy to be convinced otherwise though! Sam Wilson 06:47, 15 November 2017 (UTC)[reply]
    Fair. I'm just uneasy about giving such high profile endorsement (even if we don't think of it that way) to a feature we're already planning to kill… People might feel misled. Jdforrester (WMF) (talk) 20:08, 15 November 2017 (UTC)[reply]
    The Performance team is working on making sure that purges are never triggered with GET requests. Getting completely rid of them is more of a long-term aspiration (that would require major changes in our caching and parsing architecture) than something being worked on, I think. --Tgr (WMF) (talk) 06:09, 18 November 2017 (UTC)[reply]
  • I agree that we should assume that purging will eventually be deprecated (even though it might take a very long time for that to happen). Thus I would favor implementing this as a separate extension that has purging as an optional feature, rather than turning the UTC clock into an optional feature of the Purge extension. Kaldari (talk) 22:46, 16 November 2017 (UTC)[reply]
  • The clock thing is one of those usability train wrecks that have been around so long that we have mostly stopped noticing how bad they are. Using a clock to purge the current page, and then putting it into the personal toolbar, just makes no UX sense whatsoever. As a clock, it has poor usability anyway; the reasonable approach would be something like Google Calendar's world clock where you can set which timezones you want to see (plus maybe integration with timestamps on the page). But I doubt anyone cares about the clock part anyway. The purge link should just live in the page action dropdown. --Tgr (WMF) (talk) 06:09, 18 November 2017 (UTC)[reply]
    • @Tgr (WMF): Actually, I only care about the clock part (which I use a lot). I don't even want purging ability. I don't think setting the time zone is needed. I just need UTC time so that I can tell when various on-wiki actions occurred. Kaldari (talk) 18:19, 20 November 2017 (UTC)[reply]
  • 1) The visible layout change after page-load was fixed on mediawiki.org. 2) Creating an extension seems overkill for this feature, especially because it seems to bypass the existing project for "Global gadgets" which would solve this. This could become one of the first gadgets to be ported to WikimediaGadgets.git. 3) It is indeed safe to assume the current Gadgets system will not support "Global Preferences" because they are per-wiki (similarly named gadgets could be different things on different wikis). However, it is also safe to assume that this restriction does not apply to global gadgets. As such, if this gadget were a global gadget, it would be trivial to add its preference to the list of global preferences. --Krinkle (talk) 04:33, 21 November 2017 (UTC)[reply]

Voting


Convert AWB into a special page

  • Problem:
    • Malfunctions
    • No translations
    • Need to download the program and install continual updates
  • Which users are affected?
  • Who would benefit:
    • All program users (including administrators)
  • Proposed solution:
  • More comments:

Discussion


Voting


Fix search in AWB

  • Problem: In AutoWikiBrowser it is impossible to build the page list using insource: with regexes and other search keywords. The list has to be built in other, more complex ways, even though on Wikipedia everything works through a simple search.
  • Who would benefit: AWB editors
  • Proposed solution: Improve the search in AWB so that it works with keywords and so that you can build a list matching what the on-site search returns. A search option restricted to a single namespace would also be useful.
  • More comments:
  • Phabricator tickets:

Discussion


insource: does work in AutoWikiBrowser. It used to not work, but version 5.9.0.0 already supports it (and maybe some earlier versions too). —Tacsipacsi (talk) 18:50, 2 December 2017 (UTC)[reply]

Why has this proposal not been retracted yet? --bdijkstra (talk) 19:38, 8 December 2017 (UTC)[reply]

Voting

  • Problem: Changes to MediaWiki code related to parsing can leave links tables out of date. Sometimes code changes will add new tracking categories, typically related to error tracking. But until the page is edited, null-edited, or purged with a links update, the page will not show up in the category. Some pages do not get edited or refreshed for years, which means that errors in pages will go undetected and unfixed.
  • Who would benefit: Gnomes; WMF developers who want to move to new technologies but need editors to clean up existing pages first; readers who encounter strange page behavior.
  • Proposed solution: Null-edit all unedited pages, or perform some equivalent action, on a known periodic basis, perhaps once a month. Publicize the schedule so that it is known that any new template changes or MW code changes may take N weeks to be reflected in all relevant locations on the corresponding MW site. (A minimal pywikibot sketch of the touch step follows this list.)
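
A minimal sketch of the core operation using pywikibot's touch (the same mechanism as the touch.py script mentioned in the discussion below); a real implementation would run server-side, target only stale pages, and throttle itself carefully:

```python
# Minimal sketch: null-edit ("touch") pages so their links tables and
# tracking categories get refreshed. A real job would only touch stale
# pages and would be spread out to avoid load.
import pywikibot

site = pywikibot.Site('en', 'wikipedia')

for page in site.allpages(namespace=0, total=100):  # small batch for the sketch
    try:
        page.touch()  # null edit: re-parses the page and updates links tables
    except Exception as exc:  # broad catch; this is only a sketch
        print(f'Could not touch {page.title()}: {exc}')
```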

Discussion

  • In theory that sounds like a great idea, which I support. However, it worries me that that could lengthen the category update time even more, and right now they're SLOW. What can we do to mitigate this risk?--Strainu (talk) 12:31, 7 November 2017 (UTC)[reply]
    • The proposal is for the back-end software (MediaWiki itself, or a job queue, or something similar) to null-edit pages that are stale. The proposal, if implemented, would shorten category update times, not lengthen them. Jonesey95 (talk) 23:48, 1 December 2017 (UTC)[reply]
  • I like it but who would be responsible for the Null editing/updating the categories you reference? Zppix (talk) 19:37, 7 November 2017 (UTC)[reply]
  • That's a big problem on Commons. In general, most (if not all) pages whose categories are assigned via templates suffer from that. Right now about 1,400,000+ pages on Commons are not categorised correctly due to that problem. If a category is changed by altering the template, a touch on every template-categorised page is necessary to fix the categorisation, which otherwise keeps pointing to a redirecting page. Another example: because c:Category:Non-empty disambiguation categories had become completely useless, keeping about 8000 empty categories, I started a weekly touch run many months ago to keep it usable. --Achim (talk) 20:24, 8 November 2017 (UTC)[reply]
    • I think the issue with c:Category:Non-empty disambiguation categories is that {{PAGESINCAT:... is not considered a dynamic parser function (like e.g. {{CURRENTDAY}}) which would cause the pages to be reparsed regularly. We also don't keep track of links for pagesincat, so we can't purge when someone adds something to a category via job queue. Given this is using {{PAGESINCAT:... for the current category, we probably wouldn't even need to keep track of links, only if the current page is checking how many cats are in itself, so we could probably fix that much easier than purging all pages. BWolff (WMF) (talk) 22:23, 28 November 2017 (UTC)[reply]
  • I would support this, but only if the null edit does not appear in edit histories, and happens on a long schedule, something like repeating monthly or yearly runs, with all articles spread over the month/year to avoid server load. A Den Jentyl Ettien Avel Dysklyver (talk) 14:51, 9 November 2017 (UTC)[reply]
  • A null edit can be done using pywikibot's touch.py. But this is a solution only for smaller wikis; ~100k pages takes ca. 10 hours. JAn Dudík (talk) 11:54, 10 November 2017 (UTC)[reply]
  • I think it would be useful to add touch capability to AutoWikiBrowser, as proposed in phabricator:T167283, so there is more than one way to touch pages. --Jarekt (talk) 19:36, 15 November 2017 (UTC)[reply]
  • I'm doubtful how practical this is for all pages on large wikis. I have no idea how long such a reparse would take. It wouldn't surprise me if we're talking months to run through all pages. Maybe even years. BWolff (WMF) (talk) 22:23, 28 November 2017 (UTC)[reply]
    • The first step in implementing this proposal would be to investigate a few different methods for doing this refresh and assessing their practicality. The status quo is that some pages never get updated, so even "years" would be an improvement, but I think we can get it down into the "months" or even "weeks" range with some clever work. Jonesey95 (talk) 23:48, 1 December 2017 (UTC)[reply]

Voting


Commons deletion notification bot

  • Problem: When images that are used in a Wikipedia article are put up for deletion on Commons, the WikiProject or article associated with those images is not notified.
  • Who would benefit: The WikiProjects may have connections with the uploading institution and thus be able to get permission, or may be able to find a substitute for the image being deleted. Either way, it will improve collaboration between Wikipedia and Commons.
  • Proposed solution: Add links to these Commons discussions to the Article Alerts for WikiProjects and/or the article talk page (a rough sketch of the detection step follows this list).
  • More comments:
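
A rough sketch of the detection half of such a bot: list files currently in a Commons deletion-request category and ask the GlobalUsage API where they are used, so that a notice could then be delivered to the affected talk pages or Article Alerts. The dated category name is an assumption, and the notification step is deliberately left out:

```python
# Rough sketch: find files tagged for deletion on Commons and see which
# wikis/articles use them via the GlobalUsage API. The category name is an
# assumption; posting the actual notices is left out.
import requests

API = 'https://commons.wikimedia.org/w/api.php'
HEADERS = {'User-Agent': 'commons-deletion-notifier-sketch/0.1 (example contact)'}

def api_get(params):
    return requests.get(API, params=dict(params, format='json'),
                        headers=HEADERS).json()

# Files tagged for deletion in a dated category (name assumed).
members = api_get({
    'action': 'query', 'list': 'categorymembers',
    'cmtitle': 'Category:Deletion requests November 2017',
    'cmnamespace': 6, 'cmlimit': 50,
})['query']['categorymembers']

for member in members:
    usage = api_get({
        'action': 'query', 'prop': 'globalusage',
        'titles': member['title'], 'gulimit': 50,
    })
    for page in usage['query']['pages'].values():
        for use in page.get('globalusage', []):
            # A real bot would now notify the article's talk page or project.
            print(f"{member['title']} is used on {use['wiki']} at {use['title']}")
```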

Discussion


Supposedly there is something on Fr WP that does this.[3] Doc James (talk · contribs · email) 21:49, 13 November 2017 (UTC)[reply]

Happened to notice this discussion right after I posted about the same thing at w:Wikipedia:Village pump (idea lab)#Notification system for files on Commons nominated for deletion. If there's enough support for it then I'm happy to tackle this task -FASTILY 23:46, 16 November 2017 (UTC)[reply]

Daniel Kinzler's bot CommonsTicker did that a long time ago, but fell into disrepair. --Tgr (WMF) (talk) 05:50, 18 November 2017 (UTC)[reply]

phab:T91192 may be an alternative. Jo-Jo Eumerus (talk, contributions) 10:10, 26 November 2017 (UTC)[reply]

For Article Alerts as they currently exist, you'll have to talk to en:User:Hellknowz and make a feature request for Article Alerts. However, see this Article Alerts-related proposal. A dedicated bot putting notices on talk pages could be handled at en:WP:BOTREQ. Headbomb (talk) 04:38, 29 November 2017 (UTC)[reply]

I made the proposal on Wikiversity, which was a response to a problem: users trust Commons content, use it, and years later someone finds (or imagines) a copyright problem, the Commons file is deleted, and then Commons Delinker comes and removes the link. At that point, all the file information from Commons is deleted. (Really, that information should never be deleted even if the image itself is removed.) The original user may be long gone. So content is damaged and fixing it can be time-consuming, creating a possibly unnecessary maintenance cost. We could go to Commons and ask an admin for a copy of the file, but that's cumbersome and not actually legal if the copyright failure was real. However, if the file is hosted on Commons, there is no doubt that we can legally copy it to Wikiversity. We just don't do it because it seems unnecessary. Later, if the file is deleted from Commons, our own copy could be tagged, if appropriate, for fair use (or deleted if fair use is not appropriate, whatever it takes to satisfy the EDP requirements -- but Wikiversity tends to have liberal fair use practice). So the proposal was to copy all WV-used Commons content to Wikiversity, by local bot. However, if Commons would notify Wikiversity of pending deletions for all files used on Wikiversity, we wouldn't need that dual hosting. The NonFreeWiki proposal would be far more efficient if it were simply a filespace on Commons, where used files were moved. That's basically a no-cost solution. --Abd (talk) 14:48, 1 December 2017 (UTC)[reply]

Voting
