Shortcut: WD:RBOT

Wikidata:Bot requests

Bot requests
If you have a bot request, add a new section using the button and state exactly what you want. To reduce the processing time, first discuss the legitimacy of your request with the community in the Project chat or on the relevant WikiProject's talk page. Please refer to previous discussions justifying the task in your request.

For botflag requests, see Wikidata:Requests for permissions.

Tools available to all users which can be used to accomplish the work without the need for a bot:

  1. PetScan for creating items from Wikimedia pages and/or adding the same statement to many items
  2. QuickStatements for creating items and/or adding different statements to individual items
  3. Harvest Templates for importing statements from Wikimedia projects
  4. OpenRefine to import any type of data from tabular sources
  5. WikibaseJS-cli to write shell scripts to create and edit items in batch
On this page, old discussions are archived. An overview of all archives can be found at this page's archive index. The current archive is located at 2021/09.
SpBot archives all sections tagged with {{Section resolved|1=~~~~}} after 2 days.

Import Treccani IDs

Request date: 6 February 2019, by: Epìdosis

Task description

At the moment we have four identifiers referring to http://www.treccani.it/: Biographical Dictionary of Italian People ID (P1986), Treccani ID (P3365), Treccani's Enciclopedia Italiana ID (P4223), and Treccani's Dizionario di Storia ID (P6404). Each article in these works has, in the right-hand column "ALTRI RISULTATI PER", links to the articles on the same topic in the other works (e.g. Ugolino della Gherardesca (Q706003) has Treccani ID (P3365) conte-ugolino, and http://www.treccani.it/enciclopedia/conte-ugolino/ also links to the Enciclopedia Italiana (Treccani's Enciclopedia Italiana ID (P4223)) and the Dizionario di Storia (Treccani's Dizionario di Storia ID (P6404))). These cases are extremely frequent: many items have Biographical Dictionary of Italian People ID (P1986) but not Treccani ID (P3365)/Treccani's Enciclopedia Italiana ID (P4223); others have Treccani ID (P3365) but not Treccani's Enciclopedia Italiana ID (P4223); nearly no item has the recently created Treccani's Dizionario di Storia ID (P6404).

My request is: check each value of these identifiers in order to obtain values for the other three identifiers through the "ALTRI RISULTATI PER" column.
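A rough, unverified sketch of how such a crawl could start (assuming requests and BeautifulSoup are acceptable for the bot; the exact HTML selector for the "ALTRI RISULTATI PER" column and the URL pattern of each work would need to be confirmed by the operator):

import requests
from bs4 import BeautifulSoup

HEADERS = {"User-Agent": "Mozilla/5.0 (treccani cross-link check)"}

def altri_risultati(url):
    """Return internal Treccani entry links found on the page.
    The filter below is a guess: entries of the various works seem to live under
    /enciclopedia/ with a parenthetical work name, but this must be verified."""
    html = requests.get(url, headers=HEADERS, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    links = set()
    for a in soup.find_all("a", href=True):
        if a["href"].startswith("/enciclopedia/"):
            links.add(a["href"])
    return sorted(links)

print(altri_risultati("http://www.treccani.it/enciclopedia/conte-ugolino/"))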

Discussion

Fix local dialing code (P473) wrongly inserted

Request date: 7 November 2019, by: Andyrom75

Task description

Several entities have a wrong value for local dialing code (P473) according to the format as a regular expression (P1793) specified on the property: [\d\- ]+, which, as clarified there, excludes characters such as ,/;()+

Two typical, easily identified examples of wrong values are:

  1. local dialing code (P473) values that include the country calling code (P474) at the beginning
  2. local dialing code (P473) values that include the "optional" zero at the beginning
  • Case 1 can be detected by looking for "+"; when present, the prefix should be compared with the relevant country calling code (P474) and, if it matches, removed
  • Case 2 can be detected by looking for "(" and ")" with zeros inside; if matched, the parenthesised zero should be removed
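A minimal sketch of both checks (a hypothetical helper, assuming the P474 value is passed in with its leading "+"):

import re

def clean_local_dialing_code(value, country_calling_code):
    """Apply the two cleanups described above; return None if manual review is needed."""
    v = value.strip()
    # Case 1: the value starts with "+"; compare with P474 and strip it if it matches
    if v.startswith("+"):
        if country_calling_code and v.startswith(country_calling_code):
            v = v[len(country_calling_code):].strip()
        else:
            return None  # "+" present but it does not match P474
    # Case 2: an "optional" zero wrapped in parentheses, e.g. "(0)89"
    v = re.sub(r"\(\s*0+\s*\)\s*", "", v)
    return v.strip()

print(clean_local_dialing_code("+49 89", "+49"))  # -> "89"
print(clean_local_dialing_code("(0)89", "+49"))   # -> "89"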
Discussion
Request process

Cleaning of streaming media services URLs

Request date: 12 December 2020, by: Swicher

I'm not sure if this is the best place to propose it, but when reviewing the URLs returned by a query with this script:

import requests
from concurrent.futures import ThreadPoolExecutor

# Checks the link of an item, if it is down then saves it in the variable "novalid"
def check_url_item(item):
    # Some sites may return error if a browser useragent is not indicated
    useragent = 'Mozilla/5.0 (X11; Linux i686) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/70.0.3538.77'
    item_url = item["url"]["value"]
    print("Checking %s" % item_url, end="\r")
    req = requests.head(item_url, headers = {'User-Agent': useragent}, allow_redirects = True)
    if req.status_code == 404:
        print("The url %s in the element %s returned error" % (item_url, item["item"]["value"]))
        novalid.append(item)

base_query = """SELECT DISTINCT ?item ?url ?value
{
%s
  BIND(IF(ISBLANK(?dbvalue), "", ?dbvalue) AS ?value)
  BIND(REPLACE(?dbvalue, '(^.*)', ?url_format) AS ?url)
}"""
union_template = """  {{
    ?item p:{0} ?statement .
    OPTIONAL {{ ?statement ps:{0} ?dbvalue }}
    wd:{0} wdt:P1630 ?url_format.
  }}"""
properties = [
    "P2942", #Dailymotion channel
    "P6466", #Hulu movies
    "P6467", #Hulu series
]
# Items with links that return errors will be saved here
novalid = []

query = base_query % "\n  UNION\n".join([union_template.format(prop) for prop in properties])
req = requests.get('https://query.wikidata.org/sparql', params = {'format': 'json', 'query': query})
data = req.json()

# Schedule and run up to 25 checks concurrently while iterating over items
check_pool = ThreadPoolExecutor(max_workers=25)
# Consume the iterator so that all checks finish (and any exceptions surface) before the script exits
result = list(check_pool.map(check_url_item, data["results"]["bindings"]))

I have noticed that almost half of them are invalid. I do not know whether in these cases it is better to delete or archive them, but a bot should perform this task periodically, since the catalogues of streaming services change frequently (probably many of these broken links are due to movies/series whose licence was not renewed). Unfortunately I could only include Hulu and Dailymotion, since the rest of the services have the following problems:

For those sites it is necessary to perform a more specialized check than a HEAD request (e.g. using youtube-dl (Q28401317) for YouTube).

In the case of Hulu I have also noticed that some items can have two valid values in Hulu movie ID (P6466) and Hulu series ID (P6467) (see for example The Tower of Druaga (Q32256)) so you should take that into account when cleaning links.

Request process

Ontario public school contact info

Request date: 27 December 2020, by: Jtm-lis

Link to discussions justifying the request
Task description

https://www.wikidata.org/wiki/Wikidata:Dataset_Imports/_Ontario_public_school_contact_information

Licence of data to import (if relevant)
Discussion

request to import podcast identifiers (2021-01-03)

Request date: 3 January 2021, by: Sdkb

Link to discussions justifying the request
Task description

Several properties have recently been created (see e.g. Castbox show ID (P9005) for podcast identifiers), which are being used for the new w:Template: Podcast platform links on Wikipedia. I was told to come here to get help importing the identifiers for a bunch of podcast items.

Licence of data to import (if relevant)
Discussion

@Sdkb: Please use https://pltools.toolforge.org/harvesttemplates/ , this doesn't need to be done by a bot. Vojtěch Dostál (talk) 12:32, 15 June 2021 (UTC)

@Vojtěch Dostál: thanks for looking at this. I just looked at Harvest Templates and have no clue how it would fetch external database data on podcast identifiers. Could you advise? {{u|Sdkb}}talk 15:48, 15 June 2021 (UTC)
@Sdkb: I thought you meant to ask bots to import data from w:Template:Podcast platform links. Is that incorrect? In that case, where do you want to import the data from, and using what key to assign the identifiers to items? Vojtěch Dostál (talk) 19:24, 15 June 2021 (UTC)
@Vojtěch Dostál: Sorry for the confusion. The podcast platform links template uses data from Wikidata, so I'm looking for Wikidata to mass-import the various identifiers it includes, such as Apple podcasts ID. I'd hope at least some of those could be imported from e.g. Apple itself, but beyond that I'm not sure. {{u|Sdkb}}talk 19:32, 15 June 2021 (UTC)
@Sdkb: Hmm, I am not sure you were advised correctly by @Sic19:. This page is, to my knowledge, reserved for mostly mindless works which require bots. However, what you have in mind is a dataset import, a much more complex endeavour consisting of several steps - acquisition of CC0-licensed data, cleaning, pairing to existing entities, import etc. Your proposal is a Wikidata version of someone on Wikipedia asking for an article to be written :-) Vojtěch Dostál (talk) 19:39, 15 June 2021 (UTC)
@Vojtěch Dostál: Ah, thanks for the info. Given the backlog at w:WP:Requested articles, I'm guessing if the analogy holds my odds aren't too great of it being taken up anytime soon. But I'll look at the dataset imports page and see if there's a place to add a request. {{u|Sdkb}}talk 19:47, 15 June 2021 (UTC)
@Sdkb: If you can be a little more specific about what dataset you want imported and how it could be done I might be interested in picking it up. But generically importing any data using unknown means (how we would link the datasets) is a bit too much to do. BrokenSegue (talk) 23:54, 15 June 2021 (UTC)
@BrokenSegue: The most significant IDs are probably Apple, Google, and Spotify, so if any of those seem to have open data, those would be the ones to import. The structures at those places are explained at Apple Podcasts podcast ID (P5842), Google Podcasts show ID (P9003), and Spotify show ID (P5916). If that's enough for you to go off of, it'd be fantastic to see those properties showing up more often at Wikidata items, and would in turn allow us to start using w:Template: Podcast platform links more widely on Wikipedia. But if the data isn't open or something, we might just be out of luck; the next step would have to be reaching out to the platforms for help (they should theoretically be eager to help us, as it gets their name onto a bunch of podcast Wikipedia pages). {{u|Sdkb}}talk 01:42, 16 June 2021 (UTC)
Request process

request to fix labels of humans - disambiguator (2021-01-24)

English labels for humans shouldn't end with a ")".

The following finds some 175 of them, all with "politician" in the label.

SELECT *
{
  hint:Query hint:optimizer "None".
  SERVICE wikibase:mwapi {
    bd:serviceParam wikibase:endpoint "www.wikidata.org" .
    bd:serviceParam wikibase:api "Generator" .
    bd:serviceParam mwapi:generator "search" .
    bd:serviceParam mwapi:gsrsearch 'inlabel:politician@en haswbstatement:P31=Q5' .
    bd:serviceParam mwapi:gsrlimit "max" .    
    bd:serviceParam mwapi:gsrnamespace "0" .    
    ?item wikibase:apiOutputItem mwapi:title  .    
  }
  ?item rdfs:label ?l.
  FILTER(REGEX(?l, "\\)$") && lang(?l)="en").  
}


The usual fix would be to remove the disambiguator or make the label into an alias. The same can probably be done for other occupations/languages. --- Jura 16:10, 24 January 2021 (UTC)

@Jura1: I can code and run this, but would need a generic query to start from that would retrieve all entries regardless of occupation. No problem with stepping it over offsets if it takes a while to run. I think it would also need a bot request approval, since I don't think it falls under any of pi bot's other tasks. Thanks. Mike Peel (talk) 19:49, 24 March 2021 (UTC)
SELECT *
WITH
{
  SELECT ?value (count(*) as ?ct)
  {
    ?item wdt:P106 ?value
  }
  GROUP BY ?value    
  ORDER BY DESC(?ct) 
  OFFSET 0        
  LIMIT 50
}
AS %value
WHERE
{
  INCLUDE %value 
  hint:Query hint:optimizer "None".  
  ?value rdfs:label ?v . FILTER( lang(?v) = "en" ) 
  BIND( CONCAT( 'inlabel:"',?v,'@en" haswbstatement:P31=Q5') as ?search)
  { 
  SERVICE wikibase:mwapi {
    bd:serviceParam wikibase:endpoint "www.wikidata.org" .
    bd:serviceParam wikibase:api "Generator" .
    bd:serviceParam mwapi:generator "search" .
    bd:serviceParam mwapi:gsrsearch ?search .
    bd:serviceParam mwapi:gsrlimit "max" .    
    bd:serviceParam mwapi:gsrnamespace "0" .    
    ?item wikibase:apiOutputItem mwapi:title  .    
  }
  }
  ?item rdfs:label ?l.
  FILTER(REGEX(?l, "\\)$") && lang(?l)="en").  
}

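A minimal pywikibot sketch of the fix described above (strip the trailing parenthetical from the English label and keep the old label as an alias); item selection would come from queries such as the ones above, and this is only an illustration, not an approved bot task:

import re
import pywikibot

repo = pywikibot.Site("wikidata", "wikidata").data_repository()

def move_disambiguator_to_alias(qid):
    item = pywikibot.ItemPage(repo, qid)
    item.get()
    label = item.labels.get("en")
    if not label or not label.endswith(")"):
        return
    new_label = re.sub(r"\s*\([^()]*\)$", "", label).strip()
    if not new_label or new_label == label:
        return
    aliases = item.aliases.get("en", [])
    if label not in aliases:
        aliases.append(label)  # keep the disambiguated form as an alias
    item.editAliases({"en": aliases}, summary="keep old label as alias")
    item.editLabels({"en": new_label}, summary="remove disambiguator from English label")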

reference URL (P854) → Holocaust.cz person ID (P9109) (2021-02-05)

Request date: 5 February 2021, by: Daniel Baránek

Task description

After introducing Holocaust.cz person ID (P9109), reference URL (P854) in references can be replaced by this new identifier. The result of the edits should look like this. There are 285,282 such references. You can see all references, their reference URL (P854) value and the corresponding value for Holocaust.cz person ID (P9109) here:

SELECT ?ref ?url ?id WHERE {
  ?ref prov:wasDerivedFrom [ pr:P248 wd:Q104074149 ; pr:P854 ?url ].
  BIND (REPLACE(STR(?url),"^.*/([0-9]+)[-/].*$","$1") as ?id)
  }


Discussion


Request process

request to add identifiers from FB (2021-02-11)

Thanks to a recent import, we currently have more than 1.2 million items where the only identifier is Freebase ID (P646). However, checking https://freebase.toolforge.org/ , some of them have identifiers available there.

Samples:

See Wikidata:Project_chat#Freebase_(bis) for discussion.

Task description

Import IDs where available. Map keys to properties if they are not yet available at Wikidata:WikiProject_Freebase/Mapping.

Discussion


Request process

request to update all ckbwiki article labels (2021-02-12)

Request date: 12 February 2021, by: Aram

Link to discussions justifying the request
  • There is no discussion because I don't think updating the labels requires discussion.
Task description

Often, when moving articles, Wikidata labels are not updated to the current article names. So, we need to update all ckbwiki article labels on Wikidata. For example, I moved this by using my bot, but its label on Wikidata hasn't been updated yet. ckbwiki has 28,768 articles so far. Thanks!

Licence of data to import (if relevant)
Discussion
  • @Aram: You mean
    • the sitelinks (to ckbwiki),
    • or the labels (in ckb),
    • or both?
When page moves on ckbwiki aren't mirrored here, that generally means that the user moving the pages hasn't created an account on Wikidata. You would need to log in to Wikidata with your bot account at least once. --- Jura 14:09, 15 February 2021 (UTC)

@Aram: --- Jura 14:10, 15 February 2021 (UTC)

@Jura1: Really? I didn't know that before. Thank you for the hint! Although, it seems that my bot has been logged in in this edit, but the label has not yet been updated. However, regarding your question, we want to only update the ckbwiki labels. Thank you! Aram (talk) 15:06, 15 February 2021 (UTC)
  • It seems the account exists on Wikidata, so the sitelinks to ckbwiki are updated (since Feb 9), and edits like the one you mentioned above are no longer needed.
    However, this won't have any effect on the label of the item in ckb at Wikidata. These need to be updated separately if deemed correct (by bot, QuickStatements or manually). --- Jura 15:36, 15 February 2021 (UTC)
Thanks! Aram (talk) 20:14, 18 February 2021 (UTC)
Request process

@Aram: if something still needs to be done by bot, you might want to detail it. --- Jura 09:33, 30 March 2021 (UTC)

@Jura1: Thank you! Yes, previously we've seen a bot add missing article labels or update them on Wikidata, but it hasn't added/updated any labels for a long time (I'm talking about ckbwiki). See here as an example. Here, we want to
  • update all article labels on Wikidata.
  • update article labels while moving the article to a new title if any bot can do it automatically and immediately.
    • If not, update them every 6 months or whenever the bot manager can run the bot.
  • update the category and template labels if the bot can.
Of course, the parenthetical disambiguator after the title should be ignored. For example, the labels of both en:Casablanca and en:Casablanca (film) are "Casablanca". That is all. Thank you again! Aram (talk) 11:38, 31 March 2021 (UTC)
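A minimal pywikibot sketch of the label sync described above (assuming the bot iterates over ckbwiki article titles; the trailing parenthetical disambiguator is ignored as requested):

import re
import pywikibot

ckbwiki = pywikibot.Site("ckb", "wikipedia")

def sync_ckb_label(page_title):
    page = pywikibot.Page(ckbwiki, page_title)
    item = pywikibot.ItemPage.fromPage(page)
    item.get()
    # Strip a trailing "(...)" disambiguator, as with Casablanca (film) -> Casablanca
    new_label = re.sub(r"\s*\([^()]*\)$", "", page.title()).strip()
    if new_label and item.labels.get("ckb") != new_label:
        item.editLabels({"ckb": new_label}, summary="update ckb label to match ckbwiki title")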

request to change Belarusian language description from "спіс атыкулаў у адным з праектаў Вікімедыя" to "спіс артыкулаў у адным з праектаў Вікімедыя" in all the articles. A letter "р" was missed (2021-02-23)

Request date: 23 February 2021, by: Belarus2578

Link to discussions justifying the request

There is no discussion. There is only an obvious misprint. --Belarus2578 (talk) 05:01, 25 February 2021 (UTC)

Task description

Please, change Belarusian language description from "спіс атыкулаў у адным з праектаў Вікімедыя" to "спіс артыкулаў у адным з праектаў Вікімедыя" in all the articles. A letter "р" was missed. --Belarus2578 (talk) 06:47, 23 February 2021 (UTC)
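A minimal pywikibot sketch of the substitution (items could be fed from a SPARQL query or a dump scan matching the old description):

import pywikibot

repo = pywikibot.Site("wikidata", "wikidata").data_repository()

OLD = "спіс атыкулаў у адным з праектаў Вікімедыя"
NEW = "спіс артыкулаў у адным з праектаў Вікімедыя"

def fix_description(qid):
    item = pywikibot.ItemPage(repo, qid)
    item.get()
    # Only touch items that still carry the misspelt Belarusian description
    if item.descriptions.get("be") == OLD:
        item.editDescriptions({"be": NEW}, summary="fix typo in Belarusian description")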

Discussion
Comment There are over 250,000 items. --Matěj Suchánek (talk) 10:15, 13 March 2021 (UTC)
I would like to tackle this; does this still need community discussion? Ammarpad (talk) 14:07, 8 June 2021 (UTC)
@Ammarpad: I don't think you need full community discussion, but an independent confirmation by one more Belarusian speaker would be nice. @EugeneZelenko, Liashko, Хомелка: can you please confirm this? Vojtěch Dostál (talk) 12:38, 21 June 2021 (UTC)
@Vojtěch Dostál: Good. If any of them can confirm, I would do it then. Ammarpad (talk) 13:42, 21 June 2021 (UTC)
Yes, request is reasonable and new title is correct. --EugeneZelenko (talk) 14:13, 21 June 2021 (UTC)
Request process

request to .. (2021-03-14)

Request date: 14 March 2021, by: Mikey641

Link to discussions justifying the request

Many links in Museum of the Jewish People at Beit Hatfutsot ID (P9280), also in next section.

Task description

Hey. I would love to get some help with importing the IDs of [1] to Museum of the Jewish People at Beit Hatfutsot ID (P9280).
The structure of the URL is https://dbs.anumuseum.org.il/skn/en/e256696
Basically it could describe anything: a country/person/choir/place.
If you use https://dbs.anumuseum.org.il/skn/en/e211457 (the URL of South Africa), when pasted it will turn into https://dbs.anumuseum.org.il/skn/en/c6/e211457/Place/South_Africa
Therefore we can differentiate between Place, Family_Name, Personalities.
I have not found a tabular source for it, therefore this task requires a bot, and my programming skills are not sufficient.
I would suggest going through all URLs and matching the label to a Wikidata item, differentiating between identical labels by using the entity type in the URL and comparing it to instance of (P31). --Mikey641 (talk) 17:04, 14 March 2021 (UTC)

Discussion


Request process

request to unweave former Dutch municipalities from their eponymous capital (2021-03-31)

Request date: 2 April 2021, by: 1Veertje

Link to discussions justifying the request
Task description

As per this query there are currently 1115 human settlements that have their data mixed up with their eponymous former municipality. Unweaving this is quite tricky since it's hard to preserve references. I think these statements need to be moved to a new item:

It can be assumed that:

The original item should have the statement added that:

Further items that need adjusting:

I'm not sure about whether or not these are more appropriate for an item about the municipality:

Discussion

@Multichill, Antoni1626: Is there a way of moving this data around that preserves the references? The population statistics should also get the qualifier publisher (P123) moved to the references. --1Veertje (talk) 11:29, 2 April 2021 (UTC)

For your request above, maybe you could create the new items and then produce a query of what needs to be moved. --- Jura 12:33, 2 April 2021 (UTC)
  • Strong oppose doing this by hand, let alone by bot. What would be the point of splitting up for example Bennebroek (Q817840)? Multichill (talk) 16:55, 2 April 2021 (UTC)
    the role and area of a municipality is very different from a town. The statistics are recorded at the municipal level, but the area relevant to it is very different. I grew up in the small municipality of Wateringen, which covered the area of the eponymous village and the village of Kwintsheul. Using one and the same item to describe Kwintsheul's jurisdictional history is very ugly. Documenting a municipality like Bergeyk/Bergeijk changing its size over time is hard enough without it also needing to serve the role of its eponymous village. It's incomprehensible to me that you modeled the data in this way. 1Veertje (talk) 18:00, 2 April 2021 (UTC)
    The world is not black and white. Trying to model it like that won't work either. Might be hard to understand. The municipal status is just something that got assigned at some point to cities, villages and heerlijkheiden. In some cases it might make sense to split, in some cases it doesn't make sense. Mass splitting is not the solution. Multichill (talk) 09:39, 3 April 2021 (UTC)
    The data relevant to the municipality gets snowed under by its eponymous village if it doesn't get its own item once it gets superseded by another municipality. A simple query for municipalities in ZH doesn't show an item like Rhoon (Q687584) because the municipality of the Netherlands (Q2039348) in the P31 is the only one that doesn't have a preferred statement. You can't give the main item a dissolved, abolished or demolished date (P576) property because things are mixed up in each other, and you can't link from the new municipality to the old one with replaces (P1365). Its jurisdictional history can't very well reference itself. Assuming that just because there is an eponymous village the size would be about the same was a wrong assumption to make, as with my example of Wateringen. So far I've encountered no problems strictly following the set of instructions written out above. --1Veertje (talk) 09:59, 4 April 2021 (UTC)
  • Maybe you want to sort this out on Wikidata:Project_chat or Wikidata:De_kroeg and then make a request. --- Jura 20:25, 2 April 2021 (UTC)


Request process

request to change "instance of" on some Q-items (2021-04-09)[edit]

Request date: 9 April 2021, by: Taylor 49

Link to discussions justifying the request
[3] probably uncontroversial, nobody answered
Task description
change "instance of"
  • Help:Contents (Q914807) -> Wikimedia help page (Q56005592)
  • Appendix:TOC (Q35243371) -> Wikimedia appendix namespace page (Q101043034)

Nothing should be "instance of" TOC/Index/Contents. A bot should change all "instance of" Help:Contents (Q914807) to Wikimedia help page (Q56005592) and all "instance of" Appendix:TOC (Q35243371) to Wikimedia appendix namespace page (Q101043034).

Proposer: Taylor 49 (talk) 13:29, 9 April 2021 (UTC)

Request process

request to uprank current existing countries (2021-04-10)

Request date: 10 April 2021, by: Bouzinac

Link to discussions justifying the request
Task description

Help clean P17 data by:

Example: Q2492784#P17 --> Ukraine (Q212) [which does not have any P576] + Soviet Union (Q15180) [which has a P576] ==> Ukraine (Q212) should be upranked
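A minimal pywikibot sketch of the ranking rule in the example (uprank the single P17 value whose target has no P576); a real run would need whatever additional filtering the task list above calls for:

import pywikibot

repo = pywikibot.Site("wikidata", "wikidata").data_repository()

def uprank_current_country(qid):
    item = pywikibot.ItemPage(repo, qid)
    item.get()
    claims = item.claims.get("P17", [])
    if len(claims) < 2 or any(c.rank == "preferred" for c in claims):
        return  # nothing to do, or a preferred rank is already set
    current = []
    for claim in claims:
        country = claim.getTarget()
        if country is None:
            return
        country.get()
        if "P576" not in country.claims:  # no dissolved/abolished date: still existing
            current.append(claim)
    if len(current) == 1:
        current[0].changeRank("preferred", summary="prefer the current country")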

Discussion
Request process

import writers/screenwriters (one time data import)

When adding values for screenwriter (P58), I notice that frequently these persons don't have Wikidata items yet.

It would be helpful to identify a few sources for these and create corresponding items. Ideally every tv episode would have its writers included. --- Jura 15:05, 18 November 2018 (UTC)

It would be beneficial if information such as whether the writer wrote just the teleplay or the story were also stated.--CENNOXX (talk) 07:19, 12 April 2019 (UTC)
  • At this stage, the idea is to simply create items for writers, not adding them to works. --- Jura 12:26, 19 July 2019 (UTC)
  • Would be helpful for WP Movies. --- Jura 21:19, 26 March 2020 (UTC)
  • If these are created for TV series (for which we might not have items for every episode), the series could be mentioned with contributed to creative work (P3919). Creating them beforehand makes it easier to add them to episodes once they are created. --- Jura 09:31, 14 April 2021 (UTC)
I'm not sure if there is a good free source on screenwriters somewhere. Perhaps as an alternative we can collect red links of the corresponding infobox parameter on the English Wikipedia and create items for them? —putnik 09:40, 14 April 2021 (UTC)
Yes, episode lists and season articles could be used to create them. --- Jura 09:46, 14 April 2021 (UTC)

ValterVB LydiaPintscher Ermanon Cbrown1023 Discoveranjali Mushroom Queryzo Danrok Rogi Mbch331 Jura Jobu0101 Jklamo Jon Harald Søby putnik ohmyerica AmaryllisGardener FShbib Andreasmperu Li Song Tiot Harshrathod50 U+1F350 Bodhisattwa Shisma Wolverène Tris T7 Antoine2711 Hrk6626 TheFireBender V!v£ l@ Rosière WatchMeWiki! CptViraj ʂɤɲ Trivialist 2le2im-bdc Sotiale Wallacegromit1, mostly focus on media historiography and works from the Global South Floyd-out M2k~dewiki Rockpeterson Mathieu Kappler Sidohayder Spinster Gnoeee Ranjithsiji Notified participants of WikiProject Movies --- Jura 09:32, 14 April 2021 (UTC)

Maybe we can use IMDb Datasets with writer information (see title.crew.tsv.gz). Queryzo (talk) 14:25, 14 April 2021 (UTC)

Note that IMDB isn't counted as a reliable reference for enwp (since it is user-generated), so a different source would be better if possible (and references added during import!). Thanks. Mike Peel (talk) 06:56, 16 April 2021 (UTC)
  • Wikidata isn't enwiki. Generally in this field for this type of information, IMDb is highly regarded. As for any statement, several references can be useful. --- Jura 07:38, 16 April 2021 (UTC)
  • It could be interesting to do a one-time import and try to extract additional information from Wikipedia for current series/seasons on a regular basis. --- Jura 07:38, 16 April 2021 (UTC)

Add original title of scientific articles (data import/cleanup)

There are some articles that have their title (P1476) value enclosed in square brackets. This means that the title was translated into English and the article's original title wasn't in English.

Sample: https://www.wikidata.org/w/index.php?title=Q27687073&oldid=555470366

Generally, the following should be done:

  1. deprecate existing P1476 statement
  2. add the original title with title (P1476)
  3. add the label in the original language
  4. remove [] from the English label

--- Jura 11:03, 11 December 2018 (UTC)
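Steps 2 and 3 need the original title from an external source (e.g. the publisher or PubMed), so only the mechanical steps 1 and 4 are sketched here with pywikibot; the bracket detection is a guess based on the sample item and would need checking:

import pywikibot

repo = pywikibot.Site("wikidata", "wikidata").data_repository()

def handle_translated_title(qid):
    item = pywikibot.ItemPage(repo, qid)
    item.get()
    # Step 1: deprecate the bracketed (translated) English title
    for claim in item.claims.get("P1476", []):
        title = claim.getTarget()
        if title and title.text.startswith("["):
            claim.changeRank("deprecated", summary="title is an English translation, not the original")
    # Step 4: remove the square brackets from the English label
    label = item.labels.get("en", "")
    if label.startswith("[") and "]" in label:
        item.editLabels({"en": label.replace("[", "").replace("]", "")},
                        summary="remove brackets from translated title label")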

Research_Bot claims to do this under Maintenance Queries but I still see a lot of research papers with this issue. I might work on a script for this to try and figure out how to make a bot. Notme1560 (talk) 18:17, 21 March 2019 (UTC)
I have created a script for this task. source and permission request --Notme1560 (talk) 20:39, 23 March 2019 (UTC)
  • It seems there may be some 5 out of 220 needing this. --- Jura 17:22, 26 August 2019 (UTC)
  • Would still be worthwhile. --- Jura 21:19, 26 March 2020 (UTC)
  • --- Jura 09:34, 14 April 2021 (UTC)

Fix capitalization and grammar of Bosnian labels (2021-04-14)

Request date: 14 April 2021, by: Srđan

Link to discussions justifying the request
Task description

See: quarry:query/54093

Could you run the query once more? It should now show a lot fewer than the 418,824 items of April 14th. Edoderoo (talk) 15:05, 2 May 2021 (UTC)
@Edoderoo:: Sorry for the late reply. Just re-ran the query and it's sitting at 224,889 items. Definitely fewer than before, but still a lot to go. – Srđan (talk) 16:13, 8 May 2021 (UTC)
Update: Resultset (141.438 rows) Edoderoo (talk) 14:57, 21 May 2021 (UTC)
The wikidata-queries were empty, but the quarry had still some left, those are now in process. Almost finished ;-) Edoderoo (talk) 15:00, 8 June 2021 (UTC)

These are the descriptions that should be written in lowercase and slightly altered:

Srđan (talk) 08:37, 30 April 2021 (UTC)

Licence of data to import (if relevant)
Discussion
  • Here is a query: [4]. Maybe check if any bots still add more of them. --- Jura 09:55, 14 April 2021 (UTC)
Request process

Accepted by (Edoderoo (talk) 15:05, 2 May 2021 (UTC)) and under process

Task completed

There are some items left; those are not true Wikimedia categories, but special categories. I handled the ones with many items in them (like the category for stubs), but there are too many with just a few entries in the end. 99% of the request is done.

request to add Property:P9382 (2021-04-16)

Request date: 16 April 2021, by: 217.117.125.72

Task description

For all items with Unicode hex codepoint (P4213), add Unicode character name (P9382). A good source of data is the official Unicode site. I think that Unicode character (P487) can be used to add Unicode hex codepoint (P4213) and Unicode character name (P9382) if an item has it but lacks the other two properties. 217.117.125.72 08:28, 16 April 2021 (UTC)
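Python's unicodedata module follows the official Unicode Character Database, so both values can be derived from the character itself (P487); the exact stored format of P4213 (with or without a "U+" prefix) should be checked against the property constraints. A small illustration:

import unicodedata

def codepoint_and_name(char):
    """Derive a hex codepoint (for P4213) and a character name (for P9382) from the character (P487)."""
    codepoint = "%04X" % ord(char)      # e.g. "03A9"; prepend "U+" if the property expects it
    try:
        name = unicodedata.name(char)   # e.g. "GREEK CAPITAL LETTER OMEGA"
    except ValueError:
        name = None                     # unnamed code points (controls etc.) need the UCD directly
    return codepoint, name

print(codepoint_and_name("Ω"))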

Discussion


Request process

request to .. (2021-04-19)

Request date: 19 April 2021, by: Powell Street Festival Society

Link to discussions justifying the request

I have been tasked by the Powell Street Festival Society to upload to Wikidata a listing of Japanese-Canadian Artist information from the Japanese-Canadian Artists Directory.

Task description

I have worked through the various Wikidata steps to prepare the data to be imported. The data is in an Excel spreadsheet. It appears that I am on Step 6. I can provide a sample file with column headers to check that I have parsed the data properly.

Thank you for your attention with this request. I look forward to your response.

Regards, Michael

Licence of data to import (if relevant)
Discussion

@Powell Street Festival Society: What list of steps are you following? You don't necessarily need a bot to do this import. BrokenSegue (talk) 17:35, 19 April 2021 (UTC)

Hello BrokenSegue I am new to this process (and to Wikidata) and have been following the steps in the "Data Import Guide", I have created the "Dataset Summary" and it appears I am on Step 7: Match the data to Wikidata (Option 2: self import). I could really use some help to figure this out. I am not even sure if I am replying properly :)

Request process

September 17, 2021

Discussion

BrokenSegue if you are available, I still need help with the steps to properly upload the Excel data to Wikidata. As mentioned I am currently on step 6 of the "Data Import Guide" and not sure how to proceed. Any assistance you could provide would be greatly appreciated. Regards,

Michael

request to import data for "Cheung Chau Piu Sik Parade" (2021-05-06)

Request date: 6 May 2021, by: Hkbulibdmss

Link to discussions justifying the request
Task description

https://www.wikidata.org/wiki/Wikidata:Dataset_Imports/Cheung_Chau_Piu_Sik_Parade

Please help to import the dataset. The URL of a spreadsheet is : https://docs.google.com/spreadsheets/d/1iUVrHNsXVmn94IygtZYj0-foeUg9yvdOcwQ_V-CQbto/edit?usp=sharing

Licence of data to import (if relevant)
Discussion


Request process

request to fix parliamentary group = caucus, != party (2021-05-12)

Request date: 12 May 2021, by: Jura1

Link to discussions justifying the request
Task description
Discussion


Request process

request to automate marking preferred_rank for full dates. (2021-05-28)

Request date: 28 May 2021, by: Richard Arthur Norton (1958- )

Task description

We have year-only dates and full dates for date of birth and date of death. See for instance Eliot Blackwelder (Q16785350). We need to mark the full date as preferred rank and add reason for preferred rank = most complete record (Q105749746). The problem is that when we have two dates of equal rank, both display in infoboxes. --RAN (talk) 04:45, 28 May 2021 (UTC)
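A minimal pywikibot sketch of the rule (only act when exactly one full-precision date exists and the year-only dates agree with it, per the discussion below); the reason-for-preferred-rank qualifier property is assumed to be P7452:

import pywikibot

repo = pywikibot.Site("wikidata", "wikidata").data_repository()

def prefer_full_date(item, prop="P569"):
    item.get()
    claims = [c for c in item.claims.get(prop, []) if c.rank == "normal" and c.getTarget()]
    full = [c for c in claims if c.getTarget().precision >= 11]    # day precision
    partial = [c for c in claims if c.getTarget().precision == 9]  # year only
    if len(full) != 1 or not partial:
        return
    if any(p.getTarget().year != full[0].getTarget().year for p in partial):
        return  # conflicting years: skip
    full[0].changeRank("preferred", summary="most complete date")
    reason = pywikibot.Claim(repo, "P7452", is_qualifier=True)  # reason for preferred rank (assumed)
    reason.setTarget(pywikibot.ItemPage(repo, "Q105749746"))    # most complete record
    full[0].addQualifier(reason)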

Discussion
Request process

@Richard Arthur Norton (1958- ): What about references though? What if the less complete date has a reference and the other does not? Should we still do this? I might be able to find time to do this. BrokenSegue (talk) 05:21, 28 May 2021 (UTC)

I guess in the case where the two dates disagree we should not perform the update. BrokenSegue (talk) 05:22, 28 May 2021 (UTC)
  • I think it's already being done by @Matěj Suchánek:, if references are present.--- Jura 07:07, 28 May 2021 (UTC)
That would be great, I haven't seen the bot in action yet, I am still plugging away by hand as I come across them. --RAN (talk) 20:20, 28 May 2021 (UTC)
No, my bot does not manipulate ranks. --Matěj Suchánek (talk) 11:52, 29 May 2021 (UTC)
  • Maybe it was someone else's. Sorry then. --- Jura 11:59, 29 May 2021 (UTC)
  • @Matěj Suchánek: I think I had this in mind. --- Jura 09:33, 30 May 2021 (UTC)
    Indeed, my bot still does that (every Wednesday). In fact, it has evolved since, it also merges (seemingly) duplicate dates (that issue with -00-00 vs. -01-01 etc.). But it does not change ranks, and it even avoids statements with non-normal rank. --Matěj Suchánek (talk) 10:26, 30 May 2021 (UTC)

@Matěj Suchánek: Are you interested in picking this task up? It does kinda overlap with the task Jura mentioned. Actually, hmm, there is some subtlety here that I can see being tricky (multiple dates with different qualifiers sometimes shouldn't be merged, e.g. for start time (P580)s with an applies to part (P518)). If not I may still do it. BrokenSegue (talk) 12:40, 30 May 2021 (UTC)

Sorry, I am not right now. I guess it's easy now that we have Ranker (Q105394978), which can be driven by SPARQL. (Or maybe not that easy if the qualifier is also required, but QS can do this part.) I made up a query which can be used as basis.
What if the less complete date has a reference and the other does not? Preferred statements should always be sourced. If there is no evidence for the more precise date, it should be either removed or sourced (and then up-rank'd). --Matěj Suchánek (talk) 13:12, 30 May 2021 (UTC)
Thanks for the query; you're a SPARQL wizard. I write my bot actions self-contained in python so I don't need ranker. BrokenSegue (talk) 14:07, 30 May 2021 (UTC)
Excellent! I know there are several bots trying to fill in references for dates, but they are mostly pulling data from sources that give year-only dates. At one time I calculated that about 20% of year-only dates are off by a year because they are back calculated from the age at death in an obituary. --RAN (talk) 00:37, 1 June 2021 (UTC)
Do you know who is operating these bots? Wikibase in theory supports adding uncertainty in dates but in practice I believe the correct way to add a date with that kind of uncertainty is to use e.g. earliest date (P1319). BrokenSegue (talk) 01:31, 1 June 2021 (UTC)

request to replace qualifiers in GND ID (2021-06-07)

Request date: 7 June 2021, by: Kolja21

Link to discussions justifying the request
Task description

Please replace, in GND ID (P227), stated as (P1932) with named as (P1810):

  1. delete the qualifier stated as (P1932) from GND ID (P227)
  2. import the name of the object from GND with the qualifier named as (P1810)
  3. add retrieved (P813)

Scope: 5,161 qualifiers stated as (P1932), see Wikidata:Database reports/Constraint violations/P227#Properties statistics.

Comment (translated from German): One could add that the current version can be queried quite easily and quickly via OpenRefine reconciliation or via https://d-nb.info/gnd/100045642/about/lds.ttl (gndo:preferredNameForThePerson). (User:Emu)
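A minimal pywikibot sketch of the mechanical qualifier swap only; note the request asks to re-import the name from GND (e.g. via the lds.ttl endpoint mentioned above) and to add retrieved (P813), which is not shown here:

import pywikibot

repo = pywikibot.Site("wikidata", "wikidata").data_repository()

def swap_gnd_qualifier(item):
    item.get()
    for claim in item.claims.get("P227", []):
        for old in claim.qualifiers.get("P1932", []):
            new = pywikibot.Claim(repo, "P1810", is_qualifier=True)  # named as
            new.setTarget(old.getTarget())  # ideally replaced by the name freshly fetched from GND
            claim.addQualifier(new, summary="replace stated as (P1932) with named as (P1810)")
            claim.removeQualifier(old, summary="replace stated as (P1932) with named as (P1810)")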

Example
Discussion
Request process

Accepted by (Ammarpad (talk) 14:01, 10 June 2021 (UTC)) and under process

Fix values of P248 in references (2021-06-13)

Request date: 13 June 2021, by: Epìdosis

Link to discussions justifying the request
Task description

@Ladsgroup: In order to respect more strictly the value-type constraint (Q21510865) of stated in (P248), the following substitutions in references (unfortunately not doable through QS) are needed. --Epìdosis 20:08, 13 June 2021 (UTC)

Note: Before the first substitution (regarding Accademia delle Scienze di Torino ID (P8153)), #Accademia delle Scienze di Torino multiple references should be solved.

Query | Remove | Add
Sparql output (query) | stated in (P248) Academy of Sciences of Turin (Q2822396) | stated in (P248) www.accademiadellescienze.it (Q107212659)
Sparql output (query) | stated in (P248) Accademia della Crusca (Q338489) | stated in (P248) Catalogo degli accademici della Crusca (Q107212594)
Sparql output (query) | stated in (P248) Académie Nationale de Médecine (Q337555) | stated in (P248) www.academie-medecine.fr (Q107213173)
Sparql output (query) | stated in (P248) Hungarian Academy of Sciences (Q265058) | stated in (P248) mta.hu (Q107213548)
Sparql output (query) | stated in (P248) Montpellier Academy of Sciences and Letters (Q2822394) | stated in (P248) Montpellier Academy of Sciences and Letters (Q107214470)
Sparql output (query) | stated in (P248) Académie des sciences d'outre-mer (Q337512) | stated in (P248) www.academieoutremer.fr (Q107214481)
Sparql output (query) | stated in (P248) Académie Française (Q161806) | stated in (P248) www.academie-francaise.fr (Q107214508)
Sparql output (query) | stated in (P248) Académie Française (Q161806) | stated in (P248) www.academie-francaise.fr (Q107214508)
Sparql output (query) | stated in (P248) Comité des travaux historiques et scientifiques (Q2985434) | stated in (P248) annuaire prosopographique: la France savante (Q55740543)
Sparql output (query) | stated in (P248) Académie des Sciences Morales et Politiques (Q337543) | stated in (P248) academiesciencesmoralesetpolitiques.fr (Q107214612)
Sparql output (query) | stated in (P248) Académie des sciences morales, des lettres et des arts de Versailles (Q2822337) | stated in (P248) www.academiedeversailles.com (Q107214646)
Discussion

@Sapfan: Maybe we could add some more to the table? (Czech archives) Vojtěch Dostál (talk) 19:46, 14 June 2021 (UTC)

Thanks for letting me know! See below my list (copied from an earlier request which is still open) - can we add it? Unfortunately, there is an extra "IF", because I cannot guarantee that all the archive references should point to vital records.
If stated in (P248) is | and title (P1476) begins with | then change stated in (P248) to
Prague City Archives (Q19672898) | Archiv hl. m. Prahy, Matrika | Collection of Registry Books at Prague City Archives (Q105319160)
Prague City Archives (Q19672898) | Archiv hl. m. Prahy, Soupis pražských | List of residents in Prague 1830-1910 (1920) (Q105322358)
Moravian regional archive (Q12038677) | Moravský zemský archiv, Matrika | Collection of Registry Books at Moravian Regional Archive (Q102116996)
Státní oblastní archiv v Litoměřicích (Q18920590) | SOA Litoměřice, Matrika | Collection of Registry Books at Litoměřice State Archive (Q105319095)
Regional state archive in Pilsen (Q21953079) | SOA Plzeň, Matrika | Collection of Registry Books at Pilsen State Archive (Q105319092)
Státní oblastní archiv v Praze (Q12056840) | SOA Praha, Matrika | Collection of Registry Books at Prague State Archive (Q105319086)
Regional State Archives in Třeboň (Q12056841) | SOA Třeboň, Matrika | Collection of Registry Books at Třeboň State Archive (Q105319089)
Státní oblastní archiv v Zámrsku (Q17156873) | SOA Zámrsk, Matrika | Collection of Registry Books at Zámrsk State Archive (Q105319097)
Zemský archiv v Opavě (Q10860553) | Zemský archiv v Opavě, Matrika | Collection of Registry Books at Opava Regional Archive (Q105319099)
Museum of Czech Literature (Q5979897) | Kartotéka Jaroslava Kunce | Kunc Jaroslav (Q82329263)

Thanks in advance! --Sapfan (talk) 20:07, 14 June 2021 (UTC)

@Epìdosis@Sapfan@Vojtěch Dostál I started the first table, the second one is not that complicated. I will do that too. It'll just take some time. Amir (talk) 09:35, 19 June 2021 (UTC)

@Ladsgroup: Another one just emerged. Thanks, --Epìdosis 15:30, 25 June 2021 (UTC)

I continue adding others in the table below. --Epìdosis 12:28, 26 June 2021 (UTC)
@Epìdosis Started the clean up. Amir (talk) 15:54, 26 June 2021 (UTC)
Query | Remove | Add
Sparql output (query) | stated in (P248) Munzinger-Archiv (Q974352) | stated in (P248) Munzinger Personen (Q107343683)
Sparql output (query) | stated in (P248) National Library of Australia (Q623578) | stated in (P248) Trove (Q18609226)
Sparql output (query) | stated in (P248) Museo Galileo (Q1668196) | stated in (P248) Q105263158
Sparql output (query) | stated in (P248) Bibliothèque interuniversitaire de Santé (Q867925) | stated in (P248) Base biographique (Q105958830)
Sparql output (query) | stated in (P248) Nationalmuseum (Q842858) | stated in (P248) Catalogue of the Nationalmuseum (Q107354917)
Sparql output (query) | stated in (P248) Biblioteca Nacional de España (Q750403) | stated in (P248) datos.bne.es (Q50358336)
Request process
Epìdosis All of your cases are done, unless they grew after fixing. @Sapfan @Vojtěch Dostál: I started the bot to fix your cases some examples Amir (talk) 20:02, 31 July 2021 (UTC)
Hi @Amir! Thanks for picking it up and starting the changes. I wanted to give thumbs up (because it mostly looks good), but please look at Eduard Ringelsberg (Q98736213). You have also converted Prague City Archives (Q19672898) into Collection of Registry Books at Prague City Archives (Q105319160) where the target should have been List of residents in Prague 1830-1910 (1920) (Q105322358), namely under sibling (P3373). The difference between the two is in the first part of the title (P1476), as written in the table. Also, you will see under the same property, that some of the Q have not been changed. Can you please fix this part of the logic? Thanks! --Sapfan (talk) 20:18, 31 July 2021 (UTC)

request to cleanup DOI only items (2021-07-04)

Request date: 4 July 2021, by: Jura1

Task description

Items like Q57554778 consist mainly of DOI: the DOI is repeated as title and label.


@Daniel Mietchen: who created some or all of them. @Trilotat: who mentioned some on Wikidata:Request_a_query#Items_with_DOI_(P356)_that_start_with_10.1023/A:_without_a_Label_or_a_title_(P1476). --- Jura 13:24, 4 July 2021 (UTC)

@Jura1: To be precise, I was looking for items without a label, but I had seen this and did some research. A web search for any of the "DOI as title" DOIs will find that they are all or almost all noted in ResearchGate publication ID (P5875) items associated with Entomologia Experimentalis et Applicata (Q15753202) journal. These items are published in (P1433) CrossRef Listing of Deleted DOIs (Q53952674).
  • Q57554778 is 10.1023/A:1003902321787 and that DOI is mentioned in ResearchGate publication ID (P5875) 226608108. That researchgate item mentions the title and article details as Q107413498.
  • I added the deleted DOI to that matched item as deprecated (as withdrawn identifier value).
  • They should be merged, but I didn't as I thought it might confuse this bot request.
In the future, I think we can add the new DOI to the bad items and then rerun SourceMD as I did with Q57030816, right? Trilotat (talk) 14:54, 4 July 2021 (UTC)
List of items: User:Jura1/DOI as label. It was done using regexp 10\..+/ for title (P1476) values. — Ivan A. Krestinin (talk) 20:15, 21 July 2021 (UTC)
Request process

request to add reference (2021-07-04)

Request date: 4 July 2021, by: Data Gamer

Link to discussions justifying the request
Task description

Hello. In all items (56 items) that have position held (P39) -> member of the House of Representatives of Cyprus (Q19801674) with qualifier parliamentary term (P2937) -> 12th Cypriot Parliament (Q107003549)

I want to add the following reference to the above statement:

reference URL (P854) -> http://www.parliament.cy/el/general-information/%CE%B2%CE%BF%CF%85%CE%BB%CE%B5%CF%85%CF%84%CE%B9%CE%BA%CE%AD%CF%82-%CE%B5%CE%BA%CE%BB%CE%BF%CE%B3%CE%AD%CF%82/%CE%B5%CE%BA%CE%BB%CE%BF%CE%B3%CE%AD%CF%82-30%CE%AE%CF%82-%CE%BC%CE%B1%CE%90%CE%BF%CF%85-2021

title (P1476) -> Εκλογές 30ής Μαΐου 2021 (in Greek (el) Language)

retrieved (P813) -> 2021-07-04

archive URL (P1065) -> https://web.archive.org/web/20210704152630/http://www.parliament.cy/el/general-information/%CE%B2%CE%BF%CF%85%CE%BB%CE%B5%CF%85%CF%84%CE%B9%CE%BA%CE%AD%CF%82-%CE%B5%CE%BA%CE%BB%CE%BF%CE%B3%CE%AD%CF%82/%CE%B5%CE%BA%CE%BB%CE%BF%CE%B3%CE%AD%CF%82-30%CE%AE%CF%82-%CE%BC%CE%B1%CE%90%CE%BF%CF%85-2021

archive URL (P1065) -> https://archive.is/loRfw

archive date (P2960) -> 2021-07-04

language of work or name (P407) -> Greek (Q9129)

publisher (P123) -> House of Representatives (Q1112381)

Thanks.
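A minimal pywikibot sketch showing how such a reference could be attached (only some of the listed reference properties are shown; the full URL and the remaining fields from the list above would be added the same way):

import pywikibot

repo = pywikibot.Site("wikidata", "wikidata").data_repository()
URL = "http://www.parliament.cy/el/general-information/..."  # use the full URL given above

def add_election_reference(item):
    item.get()
    for claim in item.claims.get("P39", []):
        target = claim.getTarget()
        if not target or target.id != "Q19801674":
            continue
        terms = claim.qualifiers.get("P2937", [])
        if not any(q.getTarget() and q.getTarget().id == "Q107003549" for q in terms):
            continue
        ref_url = pywikibot.Claim(repo, "P854", is_reference=True)
        ref_url.setTarget(URL)
        title = pywikibot.Claim(repo, "P1476", is_reference=True)
        title.setTarget(pywikibot.WbMonolingualText("Εκλογές 30ής Μαΐου 2021", "el"))
        retrieved = pywikibot.Claim(repo, "P813", is_reference=True)
        retrieved.setTarget(pywikibot.WbTime(year=2021, month=7, day=4))
        claim.addSources([ref_url, title, retrieved], summary="add reference for 12th parliament")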

Licence of data to import (if relevant)
Discussion


Request process

Proliferate external-IDs from qualifiers and references to main statement (2021-07-06)

Request date: 6 July 2021, by: Vladimir Alexiev

Link to discussions justifying the request
Task description

Take a prop like ORCID: Property_talk:P496 says that 22.7% of uses are as reference, and 0.1% as qualifier.

I bet that some of those uses are not reflected as main statement.

SELECT ?itemLabel ?wdt ?wdLabel ?id { # ?ref ?wdr ?statement {
  ?wd wikibase:propertyType wikibase:ExternalId; wikibase:directClaim ?wdt; wikibase:reference ?wdr.
  ?ref ?wdr ?id.
  ?statement prov:wasDerivedFrom ?ref.
  # ?item ?prop ?statement
  # filter not exists {?item ?wdt ?id}
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". }
} limit 10


Of course, sifting through all those external-IDs used as refs will be a huge task. WD times out even on a count query:

SELECT (count(*) as ?c) {
  ?wd wikibase:propertyType wikibase:ExternalId; wikibase:reference ?wdr.
  ?ref ?wdr ?id.
}


Discussion


Request process

Request to change lexeme forms' grammatical features (2021-07-08)

Request date: 8 July 2021, by: Bennylin

Link to discussions justifying the request
Task description

How can I change the grammatical features of a form? (I operate a bot, I just need to know the commands.) I have the list of lexemes. I reckon this should not be too hard; I'm just not familiar with the commands to do the changes.

Licence of data to import (if relevant)
Discussion


Request process

request to update Template:Tr langcodes counts monolingual text periodically (2021-07-14)

Request date: 14 July 2021, by: Jura1

Task description

--- Jura 09:50, 14 July 2021 (UTC)

Request process

Help Bota .. (2021-07-27)

Request date: 27 July 2021, by: Takhirgeran Umar

Link to discussions justifying the request
Task description
Licence of data to import (if relevant)
Discussion
Comment There are around 120,000 changes. --Matěj Suchánek (talk) 16:20, 6 August 2021 (UTC)
To clarify: you want all items with that description replaced with that other description? Is there discussion around this? I can do it easily but have no idea if this is an "accepted" change. BrokenSegue (talk) 19:37, 15 August 2021 (UTC)
Request process

request to Change References to Qualifiers on property P2949 (2021-08-07)

Request date: 7 August 2021, by: Lesko987a

Link to discussions justifying the request
Task description
  • I was adding the properties named as (P1810) and retrieved (P813) as a reference to the WikiTree person ID (P2949) identifier. User Jura1 suggested changing it from a reference to a qualifier, and I agree. It was done on almost all P2949 properties (210K out of 215K). That is a lot of changes to do, and Jura suggested you can do it much faster. I would need to delete all references added and then add them as qualifiers.
Licence of data to import (if relevant)
Discussion


Request process
@Pasleim: may help.--GZWDer (talk) 17:22, 8 August 2021 (UTC)

Adding ja.wikinews category link (2021-08-08)

Request date: 8 August 2021, by: Mario1257

Link to discussions justifying the request
none
Task description
Wikidata has items for individual dates, for example August 9, 2021 (Q69306139). The recently created date categories on ja.wikinews are not tied to Wikidata. Please link the categories for April 21st to September 30th, 2021 to the corresponding items.
Discussion
Request process

Merge multiple P26 statements

Occasionally, I come across items with multiple spouse (P26) statements for the same spouse, but no diverging or conflicting qualifiers. I think these should be merged.

Sample: [6]. --- Jura 13:36, 28 August 2021 (UTC)

If there are two statements and one has a qualifier (say a start date) and the other doesn't, is merging them correct? Or should we mark one as preferred? BrokenSegue (talk) 14:23, 28 August 2021 (UTC)
I'd think so, thus the merge request. Q34851#P26 should have at least two statements with Richard Burton (Q151973) as value. --- Jura 15:47, 28 August 2021 (UTC)


SELECT * 
{
  ?item p:P26 ?st . 
  ?st prov:wasDerivedFrom / pr:P248 wd:Q75653886 .
  ?st ps:P26 ?spouse .
  ?st2 ps:P26 ?spouse .
  ?item p:P26 ?st2 .
  FILTER( ?st != ?st2 ) 
  FILTER NOT EXISTS { ?st wikibase:rank wikibase:DeprecatedRank }
  FILTER NOT EXISTS { ?st2 wikibase:rank wikibase:DeprecatedRank }  
  OPTIONAL { ?st pq:P1545 ?ord }
  OPTIONAL { ?st2 pq:P1545 ?ord2 }
}


The above currently finds some 3953 items, possibly some more filtering should be done. --- Jura 12:02, 4 September 2021 (UTC)

The first one I checked has indeed duplicate data but different sources. So within a cleanup action we should be sure not to lose any sourced info. Edoderoo (talk) 20:12, 19 September 2021 (UTC)

request to add YOB and/or YOD to TP descriptions (2021-09-01)

Request date: 1 September 2021, by: Jura1

Task description

Many TP imported items have a description in the form "Peerage person ID=\d*". These were added when these items didn't include more information.

In the meantime, some of these items include date of birth (P569) and/or date of death (P570). To make it easier to identify them, the years from these dates should be added to the description.

  • Sample edit: [7].
  • Query to find items (currently 28776):
SELECT DISTINCT ?item ?itemLabel ?d
{
  hint:Query hint:optimizer "None".
  ?item wdt:P4638 [] .
  ?item (wdt:P569|wdt:P570) [] .
  ?item schema:description ?d . 
  FILTER( lang(?d) && regex (?d, "^Peerage person ID=\\d+$") ) 
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". }
}


Thanks --- Jura 12:44, 1 September 2021 (UTC)

Discussion

I had a look at some of them, but the data is quite messy, due to a source that is messy too. I do not see a good reason to have this data in Wikidata; the only reason it got imported is because it was there. Any effort in describing this will not make it less of a mess. Edoderoo (talk) 20:09, 19 September 2021 (UTC)

  • I thought I was the only one who wasn't really convinced by the import. It has some merits and has gotten a bit more useful since. If we don't want to delete some of it, we should at least try to normalize its labels and descriptions. --- Jura 21:37, 19 September 2021 (UTC)
Request process

Parts for duos

A while back, we generated missing parts for duos. Each duo would generally have one item for each member. This finds some that lack parts. Maybe some more filtering needs to be done.

Sample items: Q6161933, Q52375494.

For a list, see Wikidata:WikiProject Q5/lists/duos.

Previous request: Wikidata:Bot_requests/Archive/2016/12#duos_without_parts. @Matěj Suchánek: --- Jura 14:56, 14 September 2021 (UTC)