.oO SearXNG Developer Documentation Oo.
searx.engines.wikidata Namespace Reference

Classes

class  WDAmountAttribute
 
class  WDArticle
 
class  WDAttribute
 
class  WDDateAttribute
 
class  WDGeoAttribute
 
class  WDImageAttribute
 
class  WDLabelAttribute
 
class  WDURLAttribute
 

Functions

 get_headers ()
 
 get_label_for_entity (entity_id, language)
 
 send_wikidata_query (query, method='GET')
 
 request (query, params)
 
 response (resp)
 
 get_thumbnail (img_src)
 
 get_results (attribute_result, attributes, language)
 
 get_query (query, language)
 
 get_attributes (language)
 
 debug_explain_wikidata_query (query, method='GET')
 
 init (engine_settings=None)
 
 fetch_traits (EngineTraits engine_traits)
 

Variables

logging.Logger logger
 
dict about
 
list display_type = ["infobox"]
 
str SPARQL_ENDPOINT_URL = 'https://query.wikidata.org/sparql'
 
str SPARQL_EXPLAIN_URL = 'https://query.wikidata.org/bigdata/namespace/wdq/sparql?explain'
 
dict WIKIDATA_PROPERTIES
 
str QUERY_TEMPLATE
 
str QUERY_PROPERTY_NAMES
 
 DUMMY_ENTITY_URLS
 
 sparql_string_escape
 
 replace_http_by_https = get_string_replaces_function({'http:': 'https:'})
 
str _IMG_SRC_DEFAULT_URL_PREFIX = "https://commons.wikimedia.org/wiki/Special:FilePath/"
 
str _IMG_SRC_NEW_URL_PREFIX = "https://upload.wikimedia.org/wikipedia/commons/thumb/"
 

Detailed Description

This module implements the Wikidata engine.  Some implementations are shared
from :ref:`wikipedia engine`.

Function Documentation

◆ debug_explain_wikidata_query()

searx.engines.wikidata.debug_explain_wikidata_query ( query,
method = 'GET' )

Definition at line 754 of file wikidata.py.

def debug_explain_wikidata_query(query, method='GET'):
    if method == 'GET':
        http_response = get(SPARQL_EXPLAIN_URL + '&' + urlencode({'query': query}), headers=get_headers())
    else:
        http_response = post(SPARQL_EXPLAIN_URL, data={'query': query}, headers=get_headers())
    http_response.raise_for_status()
    return http_response.content

References searx.engines.wikidata.get_headers().

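A minimal usage sketch (an assumption, not part of the module): ask the Blazegraph backend behind query.wikidata.org to explain its query plan instead of executing the query. Requires a SearXNG checkout on the import path and network access.

from searx.engines.wikidata import debug_explain_wikidata_query

explain_html = debug_explain_wikidata_query('SELECT ?s WHERE { ?s ?p ?o } LIMIT 1')
print(explain_html[:120])  # raw bytes of the explain report (HTML)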

◆ fetch_traits()

searx.engines.wikidata.fetch_traits ( EngineTraits engine_traits)
Uses languages evaluated from :py:obj:`wikipedia.fetch_wikimedia_traits
<searx.engines.wikipedia.fetch_wikimedia_traits>` and removes

- ``traits.custom['wiki_netloc']``: wikidata does not have net-locations for
  the languages and the list of all

- ``traits.custom['WIKIPEDIA_LANGUAGES']``: not used in the wikipedia engine

Definition at line 783 of file wikidata.py.

def fetch_traits(engine_traits: EngineTraits):
    """Uses languages evaluated from :py:obj:`wikipedia.fetch_wikimedia_traits
    <searx.engines.wikipedia.fetch_wikimedia_traits>` and removes

    - ``traits.custom['wiki_netloc']``: wikidata does not have net-locations for
      the languages and the list of all

    - ``traits.custom['WIKIPEDIA_LANGUAGES']``: not used in the wikipedia engine

    """

    fetch_wikimedia_traits(engine_traits)
    engine_traits.custom['wiki_netloc'] = {}
    engine_traits.custom['WIKIPEDIA_LANGUAGES'] = []

◆ get_attributes()

searx.engines.wikidata.get_attributes ( language)

Definition at line 353 of file wikidata.py.

def get_attributes(language):
    # pylint: disable=too-many-statements
    attributes = []

    def add_value(name):
        attributes.append(WDAttribute(name))

    def add_amount(name):
        attributes.append(WDAmountAttribute(name))

    def add_label(name):
        attributes.append(WDLabelAttribute(name))

    def add_url(name, url_id=None, **kwargs):
        attributes.append(WDURLAttribute(name, url_id, kwargs))

    def add_image(name, url_id=None, priority=1):
        attributes.append(WDImageAttribute(name, url_id, priority))

    def add_date(name):
        attributes.append(WDDateAttribute(name))

    # Dates
    for p in [
        'P571',  # inception date
        'P576',  # dissolution date
        'P580',  # start date
        'P582',  # end date
        'P569',  # date of birth
        'P570',  # date of death
        'P619',  # date of spacecraft launch
        'P620',  # date of spacecraft landing
    ]:
        add_date(p)

    for p in [
        'P27',   # country of citizenship
        'P495',  # country of origin
        'P17',   # country
        'P159',  # headquarters location
    ]:
        add_label(p)

    # Places
    for p in [
        'P36',   # capital
        'P35',   # head of state
        'P6',    # head of government
        'P122',  # basic form of government
        'P37',   # official language
    ]:
        add_label(p)

    add_value('P1082')   # population
    add_amount('P2046')  # area
    add_amount('P281')   # postal code
    add_label('P38')     # currency
    add_amount('P2048')  # height (building)

    # Media
    for p in [
        'P400',  # platform (videogames, computing)
        'P50',   # author
        'P170',  # creator
        'P57',   # director
        'P175',  # performer
        'P178',  # developer
        'P162',  # producer
        'P176',  # manufacturer
        'P58',   # screenwriter
        'P272',  # production company
        'P264',  # record label
        'P123',  # publisher
        'P449',  # original network
        'P750',  # distributed by
        'P86',   # composer
    ]:
        add_label(p)

    add_date('P577')   # publication date
    add_label('P136')  # genre (music, film, artistic...)
    add_label('P364')  # original language
    add_value('P212')  # ISBN-13
    add_value('P957')  # ISBN-10
    add_label('P275')  # copyright license
    add_label('P277')  # programming language
    add_value('P348')  # version
    add_label('P840')  # narrative location

    # Languages
    add_value('P1098')  # number of speakers
    add_label('P282')   # writing system
    add_label('P1018')  # language regulatory body
    add_value('P218')   # language code (ISO 639-1)

    # Other
    add_label('P169')   # ceo
    add_label('P112')   # founded by
    add_label('P1454')  # legal form (company, organization)
    add_label('P137')   # operator (service, facility, ...)
    add_label('P1029')  # crew members (tripulation)
    add_label('P225')   # taxon name
    add_value('P274')   # chemical formula
    add_label('P1346')  # winner (sports, contests, ...)
    add_value('P1120')  # number of deaths
    add_value('P498')   # currency code (ISO 4217)

    # URL
    add_url('P856', official=True)  # official website
    attributes.append(WDArticle(language))  # wikipedia (user language)
    if not language.startswith('en'):
        attributes.append(WDArticle('en'))  # wikipedia (english)

    add_url('P1324')  # source code repository
    add_url('P1581')  # blog
    add_url('P434', url_id='musicbrainz_artist')
    add_url('P435', url_id='musicbrainz_work')
    add_url('P436', url_id='musicbrainz_release_group')
    add_url('P966', url_id='musicbrainz_label')
    add_url('P345', url_id='imdb_id')
    add_url('P2397', url_id='youtube_channel')
    add_url('P1651', url_id='youtube_video')
    add_url('P2002', url_id='twitter_profile')
    add_url('P2013', url_id='facebook_profile')
    add_url('P2003', url_id='instagram_profile')

    # Map
    attributes.append(WDGeoAttribute('P625'))

    # Image
    add_image('P15', priority=1, url_id='wikimedia_image')    # route map
    add_image('P242', priority=2, url_id='wikimedia_image')   # locator map
    add_image('P154', priority=3, url_id='wikimedia_image')   # logo
    add_image('P18', priority=4, url_id='wikimedia_image')    # image
    add_image('P41', priority=5, url_id='wikimedia_image')    # flag
    add_image('P2716', priority=6, url_id='wikimedia_image')  # collage
    add_image('P2910', priority=7, url_id='wikimedia_image')  # icon

    return attributes

Referenced by searx.engines.wikidata.get_query(), and searx.engines.wikidata.init().

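As a quick illustration (hedged; assumes the module can be imported outside a running SearXNG instance), the attribute list can be inspected directly:

from searx.engines.wikidata import get_attributes

attrs = get_attributes('en')
print(len(attrs))  # several dozen attribute objects
print(sorted({type(a).__name__ for a in attrs}))
# e.g. ['WDAmountAttribute', 'WDArticle', 'WDAttribute', 'WDDateAttribute', ...]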

◆ get_headers()

searx.engines.wikidata.get_headers ( )

Definition at line 140 of file wikidata.py.

def get_headers():
    # user agent: https://www.mediawiki.org/wiki/Wikidata_Query_Service/User_Manual#Query_limits
    return {'Accept': 'application/sparql-results+json', 'User-Agent': searx_useragent()}

Referenced by searx.engines.wikidata.debug_explain_wikidata_query(), searx.engines.wikidata.request(), and searx.engines.wikidata.send_wikidata_query().

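A minimal sketch of what the function returns (the exact User-Agent string depends on the installation):

from searx.engines.wikidata import get_headers

headers = get_headers()
assert headers['Accept'] == 'application/sparql-results+json'
print(headers['User-Agent'])  # e.g. "searxng/<version> ..." from searx_useragent()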

◆ get_label_for_entity()

searx.engines.wikidata.get_label_for_entity ( entity_id,
language )

Definition at line 145 of file wikidata.py.

def get_label_for_entity(entity_id, language):
    name = WIKIDATA_PROPERTIES.get(entity_id)
    if name is None:
        name = WIKIDATA_PROPERTIES.get((entity_id, language))
    if name is None:
        name = WIKIDATA_PROPERTIES.get((entity_id, language.split('-')[0]))
    if name is None:
        name = WIKIDATA_PROPERTIES.get((entity_id, 'en'))
    if name is None:
        name = entity_id
    return name

Referenced by searx.engines.wikidata.WDAttribute.get_label(), and searx.engines.wikidata.WDAmountAttribute.get_str().

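The lookup falls back step by step: plain entity id, then (id, full language tag), then (id, primary subtag), then (id, 'en'), and finally the id itself. A hedged sketch ('P345' is one of the default WIKIDATA_PROPERTIES entries; 'P9999' stands in for an unknown property):

from searx.engines.wikidata import get_label_for_entity

print(get_label_for_entity('P345', 'de-CH'))   # 'IMDb'  (plain id entry)
print(get_label_for_entity('P9999', 'de-CH'))  # 'P9999' (no label known, id returned)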

◆ get_query()

searx.engines.wikidata.get_query ( query,
language )

Definition at line 336 of file wikidata.py.

def get_query(query, language):
    attributes = get_attributes(language)
    select = [a.get_select() for a in attributes]
    where = list(filter(lambda s: len(s) > 0, [a.get_where() for a in attributes]))
    wikibase_label = list(filter(lambda s: len(s) > 0, [a.get_wikibase_label() for a in attributes]))
    group_by = list(filter(lambda s: len(s) > 0, [a.get_group_by() for a in attributes]))
    query = (
        QUERY_TEMPLATE.replace('%QUERY%', sparql_string_escape(query))
        .replace('%SELECT%', ' '.join(select))
        .replace('%WHERE%', '\n '.join(where))
        .replace('%WIKIBASE_LABELS%', '\n '.join(wikibase_label))
        .replace('%GROUP_BY%', ' '.join(group_by))
        .replace('%LANGUAGE%', language)
    )
    return query, attributes

References searx.engines.wikidata.get_attributes().

Referenced by searx.engines.wikidata.request().

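A hedged sketch of the template substitution (no network access is needed; only string replacement happens here):

from searx.engines.wikidata import get_query

sparql, attributes = get_query('Berlin', 'de')
assert 'mwapi:search "Berlin"' in sparql  # %QUERY% substituted (escaped)
assert 'mwapi:language "de"' in sparql    # %LANGUAGE% substituted
print(len(attributes))                    # same list as get_attributes('de')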

◆ get_results()

searx.engines.wikidata.get_results ( attribute_result,
attributes,
language )

Definition at line 253 of file wikidata.py.

def get_results(attribute_result, attributes, language):
    # pylint: disable=too-many-branches
    results = []
    infobox_title = attribute_result.get('itemLabel')
    infobox_id = attribute_result['item']
    infobox_id_lang = None
    infobox_urls = []
    infobox_attributes = []
    infobox_content = attribute_result.get('itemDescription', [])
    img_src = None
    img_src_priority = 0

    for attribute in attributes:
        value = attribute.get_str(attribute_result, language)
        if value is not None and value != '':
            attribute_type = type(attribute)

            if attribute_type in (WDURLAttribute, WDArticle):
                # get_select() method : there is group_concat(distinct ...;separator=", ")
                # split the value here
                for url in value.split(', '):
                    infobox_urls.append({'title': attribute.get_label(language), 'url': url, **attribute.kwargs})
                    # "normal" results (not infobox) include official website and Wikipedia links.
                    if "list" in display_type and (attribute.kwargs.get('official') or attribute_type == WDArticle):
                        results.append({'title': infobox_title, 'url': url, "content": infobox_content})

                    # update the infobox_id with the wikipedia URL
                    # first the local wikipedia URL, and as fallback the english wikipedia URL
                    if attribute_type == WDArticle and (
                        (attribute.language == 'en' and infobox_id_lang is None) or attribute.language != 'en'
                    ):
                        infobox_id_lang = attribute.language
                        infobox_id = url
            elif attribute_type == WDImageAttribute:
                # this attribute is an image.
                # replace the current image only if this attribute's priority is higher
                # (the infobox contains only one image).
                if attribute.priority > img_src_priority:
                    img_src = get_thumbnail(value)
                    img_src_priority = attribute.priority
            elif attribute_type == WDGeoAttribute:
                # geocoordinate link
                # use the area to get the OSM zoom
                # Note: ignore the unit (must be km² otherwise the calculation is wrong)
                # Should use normalized value p:P2046/psn:P2046/wikibase:quantityAmount
                area = attribute_result.get('P2046')
                osm_zoom = area_to_osm_zoom(area) if area else 19
                url = attribute.get_geo_url(attribute_result, osm_zoom=osm_zoom)
                if url:
                    infobox_urls.append({'title': attribute.get_label(language), 'url': url, 'entity': attribute.name})
            else:
                infobox_attributes.append(
                    {'label': attribute.get_label(language), 'value': value, 'entity': attribute.name}
                )

    if infobox_id:
        infobox_id = replace_http_by_https(infobox_id)

    # add the wikidata URL at the end
    infobox_urls.append({'title': 'Wikidata', 'url': attribute_result['item']})

    if (
        "list" in display_type
        and img_src is None
        and len(infobox_attributes) == 0
        and len(infobox_urls) == 1
        and len(infobox_content) == 0
    ):
        results.append({'url': infobox_urls[0]['url'], 'title': infobox_title, 'content': infobox_content})
    elif "infobox" in display_type:
        results.append(
            {
                'infobox': infobox_title,
                'id': infobox_id,
                'content': infobox_content,
                'img_src': img_src,
                'urls': infobox_urls,
                'attributes': infobox_attributes,
            }
        )
    return results

References searx.engines.wikidata.replace_http_by_https.

Referenced by searx.engines.wikidata.response().

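For orientation, a hypothetical attribute_result as response() would pass it in, i.e. one SPARQL binding flattened to {key: value['value']} (the keys and values below are illustrative, not real output):

attribute_result = {
    'item': 'http://www.wikidata.org/entity/Q90',
    'itemLabel': 'Paris',
    'itemDescription': 'capital and largest city of France',
    'P2046': '105.4',  # area, used to derive the OSM zoom level
}
# get_results(attribute_result, attributes, language) then yields either a
# plain "list" result or an "infobox" result, depending on display_type.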

◆ get_thumbnail()

searx.engines.wikidata.get_thumbnail ( img_src)
Get Thumbnail image from wikimedia commons

Images from commons.wikimedia.org are (HTTP) redirected to
upload.wikimedia.org.  The redirected URL can be calculated by this
function.

- https://stackoverflow.com/a/33691240

Definition at line 214 of file wikidata.py.

def get_thumbnail(img_src):
    """Get Thumbnail image from wikimedia commons

    Images from commons.wikimedia.org are (HTTP) redirected to
    upload.wikimedia.org. The redirected URL can be calculated by this
    function.

    - https://stackoverflow.com/a/33691240

    """
    logger.debug('get_thumbnail(): %s', img_src)
    if img_src is not None and _IMG_SRC_DEFAULT_URL_PREFIX in img_src.split()[0]:
        img_src_name = unquote(img_src.replace(_IMG_SRC_DEFAULT_URL_PREFIX, "").split("?", 1)[0].replace("%20", "_"))
        img_src_name_first = img_src_name
        img_src_name_second = img_src_name

        if ".svg" in img_src_name.split()[0]:
            img_src_name_second = img_src_name + ".png"

        img_src_size = img_src.replace(_IMG_SRC_DEFAULT_URL_PREFIX, "").split("?", 1)[1]
        img_src_size = img_src_size[img_src_size.index("=") + 1 : img_src_size.index("&")]
        img_src_name_md5 = md5(img_src_name.encode("utf-8")).hexdigest()
        img_src = (
            _IMG_SRC_NEW_URL_PREFIX
            + img_src_name_md5[0]
            + "/"
            + img_src_name_md5[0:2]
            + "/"
            + img_src_name_first
            + "/"
            + img_src_size
            + "px-"
            + img_src_name_second
        )
        logger.debug('get_thumbnail() redirected: %s', img_src)

    return img_src
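
A hedged example of the redirect calculation (the file name is hypothetical; <m>/<mm> stand for the first one and two hex digits of the MD5 of the underscore-normalized file name):

from searx.engines.wikidata import get_thumbnail

src = (
    "https://commons.wikimedia.org/wiki/Special:FilePath/"
    "Douglas%20adams%20portrait.jpg?width=500&height=400"
)
print(get_thumbnail(src))
# https://upload.wikimedia.org/wikipedia/commons/thumb/<m>/<mm>/
#     Douglas_adams_portrait.jpg/500px-Douglas_adams_portrait.jpg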

◆ init()

searx.engines.wikidata.init ( engine_settings = None)

Definition at line 763 of file wikidata.py.

def init(engine_settings=None):  # pylint: disable=unused-argument
    # WIKIDATA_PROPERTIES : add unit symbols
    for k, v in WIKIDATA_UNITS.items():
        WIKIDATA_PROPERTIES[k] = v['symbol']

    # WIKIDATA_PROPERTIES : add property labels
    wikidata_property_names = []
    for attribute in get_attributes('en'):
        if type(attribute) in (WDAttribute, WDAmountAttribute, WDURLAttribute, WDDateAttribute, WDLabelAttribute):
            if attribute.name not in WIKIDATA_PROPERTIES:
                wikidata_property_names.append("wd:" + attribute.name)
    query = QUERY_PROPERTY_NAMES.replace('%ATTRIBUTES%', " ".join(wikidata_property_names))
    jsonresponse = send_wikidata_query(query)
    for result in jsonresponse.get('results', {}).get('bindings', {}):
        name = result['name']['value']
        lang = result['name']['xml:lang']
        entity_id = result['item']['value'].replace('http://www.wikidata.org/entity/', '')
        WIKIDATA_PROPERTIES[(entity_id, lang)] = name.capitalize()

References searx.engines.wikidata.get_attributes().

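A hedged illustration of the side effects (the concrete entries below are assumptions; init() issues a live SPARQL request via send_wikidata_query()):

from searx.engines.wikidata import WIKIDATA_PROPERTIES, init

init()
# Unit symbols are keyed by unit entity id, e.g. (assumed entry from
# searx.data.WIKIDATA_UNITS):
#   WIKIDATA_PROPERTIES.get('Q11573') == 'm'
# Capitalized property labels are keyed by (property id, language), e.g.:
#   WIKIDATA_PROPERTIES.get(('P571', 'en')) == 'Inception'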

◆ request()

searx.engines.wikidata.request ( query,
params )

Definition at line 172 of file wikidata.py.

def request(query, params):

    eng_tag, _wiki_netloc = get_wiki_params(params['searxng_locale'], traits)
    query, attributes = get_query(query, eng_tag)
    logger.debug("request --> language %s // len(attributes): %s", eng_tag, len(attributes))

    params['method'] = 'POST'
    params['url'] = SPARQL_ENDPOINT_URL
    params['data'] = {'query': query}
    params['headers'] = get_headers()
    params['language'] = eng_tag
    params['attributes'] = attributes

    return params

References searx.engines.wikidata.get_headers(), and searx.engines.wikidata.get_query().

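A hedged sketch of the engine hook; the params dict below is a hypothetical minimal stand-in for what SearXNG passes in, and the module-level traits must already be initialised (normally by the engine loader):

from searx.engines.wikidata import request

params = {'searxng_locale': 'de-DE'}
params = request('paris', params)
print(params['method'])              # POST
print(params['url'])                 # https://query.wikidata.org/sparql
print(params['data']['query'][:60])  # the generated SPARQL query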

◆ response()

searx.engines.wikidata.response ( resp)

Definition at line 188 of file wikidata.py.

def response(resp):

    results = []
    jsonresponse = loads(resp.content.decode())

    language = resp.search_params['language']
    attributes = resp.search_params['attributes']
    logger.debug("request --> language %s // len(attributes): %s", language, len(attributes))

    seen_entities = set()
    for result in jsonresponse.get('results', {}).get('bindings', []):
        attribute_result = {key: value['value'] for key, value in result.items()}
        entity_url = attribute_result['item']
        if entity_url not in seen_entities and entity_url not in DUMMY_ENTITY_URLS:
            seen_entities.add(entity_url)
            results += get_results(attribute_result, attributes, language)
        else:
            logger.debug('The SPARQL request returns duplicate entities: %s', str(attribute_result))

    return results

References searx.engines.wikidata.get_results().


◆ send_wikidata_query()

searx.engines.wikidata.send_wikidata_query ( query,
method = 'GET' )

Definition at line 158 of file wikidata.py.

def send_wikidata_query(query, method='GET'):
    if method == 'GET':
        # query will be cached by wikidata
        http_response = get(SPARQL_ENDPOINT_URL + '?' + urlencode({'query': query}), headers=get_headers())
    else:
        # query won't be cached by wikidata
        http_response = post(SPARQL_ENDPOINT_URL, data={'query': query}, headers=get_headers())
    if http_response.status_code != 200:
        logger.debug('SPARQL endpoint error %s', http_response.content.decode())
    logger.debug('request time %s', str(http_response.elapsed))
    http_response.raise_for_status()
    return loads(http_response.content.decode())

References searx.engines.wikidata.get_headers().

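A hedged usage sketch (requires network access; the wd: and rdfs: prefixes are predefined by the Wikidata endpoint):

from searx.engines.wikidata import send_wikidata_query

data = send_wikidata_query(
    'SELECT ?label WHERE { wd:Q42 rdfs:label ?label . '
    'FILTER(LANG(?label) = "en") } LIMIT 1'
)
print(data['results']['bindings'][0]['label']['value'])  # "Douglas Adams"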

Variable Documentation

◆ _IMG_SRC_DEFAULT_URL_PREFIX

str searx.engines.wikidata._IMG_SRC_DEFAULT_URL_PREFIX = "https://commons.wikimedia.org/wiki/Special:FilePath/"
protected

Definition at line 210 of file wikidata.py.

◆ _IMG_SRC_NEW_URL_PREFIX

str searx.engines.wikidata._IMG_SRC_NEW_URL_PREFIX = "https://upload.wikimedia.org/wikipedia/commons/thumb/"
protected

Definition at line 211 of file wikidata.py.

◆ about

dict searx.engines.wikidata.about
Initial value:
= {
    "website": 'https://wikidata.org/',
    "wikidata_id": 'Q2013',
    "official_api_documentation": 'https://query.wikidata.org/',
    "use_official_api": True,
    "require_api_key": False,
    "results": 'JSON',
}

Definition at line 34 of file wikidata.py.

◆ display_type

list searx.engines.wikidata.display_type = ["infobox"]

Definition at line 43 of file wikidata.py.

◆ DUMMY_ENTITY_URLS

searx.engines.wikidata.DUMMY_ENTITY_URLS
Initial value:
= set(
    "http://www.wikidata.org/entity/" + wid for wid in ("Q4115189", "Q13406268", "Q15397819", "Q17339402")
)

These are Wikidata's sandbox/test items (Q4115189 is the Wikidata Sandbox); response() skips any result whose entity URL is in this set.

Definition at line 115 of file wikidata.py.

◆ logger

logging.Logger searx.engines.wikidata.logger

Definition at line 29 of file wikidata.py.

◆ QUERY_PROPERTY_NAMES

str searx.engines.wikidata.QUERY_PROPERTY_NAMES
Initial value:
1= """
2SELECT ?item ?name
3WHERE {
4 {
5 SELECT ?item
6 WHERE { ?item wdt:P279* wd:Q12132 }
7 } UNION {
8 VALUES ?item { %ATTRIBUTES% }
9 }
10 OPTIONAL { ?item rdfs:label ?name. }
11}
12"""

Definition at line 100 of file wikidata.py.

◆ QUERY_TEMPLATE

str searx.engines.wikidata.QUERY_TEMPLATE
Initial value:
1= """
2SELECT ?item ?itemLabel ?itemDescription ?lat ?long %SELECT%
3WHERE
4{
5 SERVICE wikibase:mwapi {
6 bd:serviceParam wikibase:endpoint "www.wikidata.org";
7 wikibase:api "EntitySearch";
8 wikibase:limit 1;
9 mwapi:search "%QUERY%";
10 mwapi:language "%LANGUAGE%".
11 ?item wikibase:apiOutputItem mwapi:item.
12 }
13 hint:Prior hint:runFirst "true".
14
15 %WHERE%
16
17 SERVICE wikibase:label {
18 bd:serviceParam wikibase:language "%LANGUAGE%,en".
19 ?item rdfs:label ?itemLabel .
20 ?item schema:description ?itemDescription .
21 %WIKIBASE_LABELS%
22 }
23
24}
25GROUP BY ?item ?itemLabel ?itemDescription ?lat ?long %GROUP_BY%
26"""

Definition at line 72 of file wikidata.py.

◆ replace_http_by_https

searx.engines.wikidata.replace_http_by_https = get_string_replaces_function({'http:': 'https:'})

Definition at line 137 of file wikidata.py.

Referenced by searx.engines.wikidata.get_results().
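
Wikidata entity URIs use the http: scheme; this helper upgrades them for display. A quick sketch:

from searx.engines.wikidata import replace_http_by_https

print(replace_http_by_https('http://www.wikidata.org/entity/Q42'))
# https://www.wikidata.org/entity/Q42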

◆ SPARQL_ENDPOINT_URL

str searx.engines.wikidata.SPARQL_ENDPOINT_URL = 'https://query.wikidata.org/sparql'

Definition at line 50 of file wikidata.py.

◆ SPARQL_EXPLAIN_URL

str searx.engines.wikidata.SPARQL_EXPLAIN_URL = 'https://query.wikidata.org/bigdata/namespace/wdq/sparql?explain'

Definition at line 51 of file wikidata.py.

◆ sparql_string_escape

searx.engines.wikidata.sparql_string_escape
Initial value:
= get_string_replaces_function(
    # fmt: off
    {
        '\t': '\\\t',
        '\n': '\\\n',
        '\r': '\\\r',
        '\b': '\\\b',
        '\f': '\\\f',
        '\"': '\\\"',
        '\'': '\\\'',
        '\\': '\\\\'
    }
    # fmt: on
)

Definition at line 122 of file wikidata.py.
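
Since the user's query is embedded into a SPARQL string literal (see QUERY_TEMPLATE's mwapi:search "%QUERY%"), quotes and backslashes must be escaped. A quick sketch:

from searx.engines.wikidata import sparql_string_escape

print(sparql_string_escape('say "hello"'))  # say \"hello\"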

◆ WIKIDATA_PROPERTIES

dict searx.engines.wikidata.WIKIDATA_PROPERTIES
Initial value:
= {
    'P434': 'MusicBrainz',
    'P435': 'MusicBrainz',
    'P436': 'MusicBrainz',
    'P966': 'MusicBrainz',
    'P345': 'IMDb',
    'P2397': 'YouTube',
    'P1651': 'YouTube',
    'P2002': 'Twitter',
    'P2013': 'Facebook',
    'P2003': 'Instagram',
}

Definition at line 52 of file wikidata.py.