[fix] searxng_extra/update/update_engine_descriptions.py (part 1)

Follow-up to #2269

The script that updates the engine descriptions no longer works since PR #2269
was merged.

searx/engines/wikipedia.py
==========================

1. There was a misuse of zh-classical.wikipedia.org:

   - `zh-classical` is dedicated to classical Chinese [1], which is not the
     same as traditional Chinese [2].

   - zh.wikipedia.org has LanguageConverter enabled [3] and dynamically
     serves simplified or traditional Chinese according to the HTTP
     Accept-Language header (a minimal sketch of this behaviour follows
     after this list).

2. The update_engine_descriptions.py script needs a list of all Wikipedias.
   The implementation from #2269 included only a reduced list:

   - https://meta.wikimedia.org/wiki/Wikipedia_article_depth
   - https://meta.wikimedia.org/wiki/List_of_Wikipedias

   A sketch of collecting such a complete list is also given after this list.
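
A minimal sketch of the LanguageConverter behaviour described in item 1; it
is not part of this commit, assumes the `requests` package and uses the
public REST summary endpoint of zh.wikipedia.org (the article title is only
an example):

    # zh.wikipedia.org converts the script of its content according to the
    # HTTP Accept-Language header, so zh-TW / zh-HK do not need the separate
    # zh-classical.wikipedia.org (which hosts classical Chinese).
    import requests

    # URL-encoded title of an arbitrary example article
    URL = "https://zh.wikipedia.org/api/rest_v1/page/summary/%E4%B8%AD%E5%9B%BD"

    for accept_lang in ("zh-CN", "zh-TW"):
        resp = requests.get(
            URL, headers={"Accept-Language": accept_lang}, timeout=10
        )
        resp.raise_for_status()
        # the extract should come back in simplified script for zh-CN and
        # in traditional script for zh-TW
        print(accept_lang, "->", resp.json()["extract"][:40])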
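
A rough sketch of how a complete list of Wikipedias could be collected from
the "List of Wikipedias" page (item 2); this is only an illustration under
the assumption that every language edition is linked via its
`<lang>.wikipedia.org` netloc in the tables there, it is not the
implementation used by the wikipedia engine:

    import re
    import requests
    from lxml import html

    resp = requests.get(
        "https://meta.wikimedia.org/wiki/List_of_Wikipedias", timeout=10
    )
    dom = html.fromstring(resp.text)

    # collect the language codes of all linked *.wikipedia.org editions
    codes = set()
    for href in dom.xpath('//table//td//a/@href'):
        m = re.match(r'^(?:https?:)?//([a-z-]+)\.wikipedia\.org', href)
        if m:
            codes.add(m.group(1))

    print(len(codes), sorted(codes)[:10])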

searxng_extra/update/update_engine_descriptions.py
==================================================

Before PR #2269 there was a match_language() function that did an
approximation using various methods.  With PR #2269 only the types of the
language data model remain, i.e. only what babel can recognize.  The
approximation methods, which are needed (only here) to determine the
descriptions, have to be replaced by other methods.
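
One possible babel-based approximation is sketched below; the KNOWN_LOCALES
list and the fallback order are made-up assumptions for illustration, not
the implementation of this commit:

    from typing import Optional
    from babel import Locale, UnknownLocaleError

    KNOWN_LOCALES = ["en", "de", "fr", "pt_BR", "zh_Hans", "zh_Hant"]  # assumed

    def approximate_locale(wiki_tag: str) -> Optional[str]:
        """Best-effort mapping of a Wikipedia tag onto a known locale."""
        try:
            locale = Locale.parse(wiki_tag, sep='-')
        except (ValueError, UnknownLocaleError):
            return None
        # 1. exact match on the full tag or on the plain language
        for candidate in (str(locale), locale.language):
            if candidate in KNOWN_LOCALES:
                return candidate
        # 2. let babel negotiate over the language part only
        match = Locale.negotiate([locale.language], KNOWN_LOCALES)
        return str(match) if match else None

    # e.g. approximate_locale('pt-br') --> 'pt_BR'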

[1] https://en.wikipedia.org/wiki/Classical_Chinese
[2] https://en.wikipedia.org/wiki/Traditional_Chinese_characters
[3] https://www.mediawiki.org/wiki/Writing_systems#LanguageConverter

Closes: https://github.com/searxng/searxng/issues/2330
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>

commit 27369ebec2
parent 0adfed195e
Author: Markus Heiser
Date:   2023-04-04 15:17:12 +02:00

5 changed files with 262 additions and 125 deletions

--- a/searx/engines/wikidata.py
+++ b/searx/engines/wikidata.py

@@ -18,7 +18,10 @@ from searx.data import WIKIDATA_UNITS
 from searx.network import post, get
 from searx.utils import searx_useragent, get_string_replaces_function
 from searx.external_urls import get_external_url, get_earth_coordinates_url, area_to_osm_zoom
-from searx.engines.wikipedia import fetch_traits as _fetch_traits
+from searx.engines.wikipedia import (
+    fetch_wikimedia_traits,
+    get_wiki_params,
+)
 from searx.enginelib.traits import EngineTraits
 
 if TYPE_CHECKING:
@@ -165,17 +168,15 @@ def request(query, params):
-    # wikidata does not support zh-classical (zh_Hans) / zh-TW, zh-HK and zh-CN
-    # mapped to zh
-    sxng_lang = params['searxng_locale'].split('-')[0]
-    language = traits.get_language(sxng_lang, 'en')
-    query, attributes = get_query(query, language)
-    logger.debug("request --> language %s // len(attributes): %s", language, len(attributes))
+    eng_tag, _wiki_netloc = get_wiki_params(params['searxng_locale'], traits)
+    query, attributes = get_query(query, eng_tag)
+    logger.debug("request --> language %s // len(attributes): %s", eng_tag, len(attributes))
 
     params['method'] = 'POST'
     params['url'] = SPARQL_ENDPOINT_URL
     params['data'] = {'query': query}
     params['headers'] = get_headers()
-    params['language'] = language
+    params['language'] = eng_tag
     params['attributes'] = attributes
 
     return params
@@ -769,12 +770,16 @@ def init(engine_settings=None): # pylint: disable=unused-argument
 def fetch_traits(engine_traits: EngineTraits):
-    """Use languages evaluated from :py:obj:`wikipedia.fetch_traits
-    <searx.engines.wikipedia.fetch_traits>` except zh-classical (zh_Hans) what
-    is not supported by wikidata."""
-
-    _fetch_traits(engine_traits)
-    # wikidata does not support zh-classical (zh_Hans)
-    engine_traits.languages.pop('zh_Hans')
-    # wikidata does not have net-locations for the languages
+    """Uses languages evaluated from :py:obj:`wikipedia.fetch_wikimedia_traits
+    <searx.engines.wikipedia.fetch_wikimedia_traits>` and removes
+
+    - ``traits.custom['wiki_netloc']``: wikidata does not have net-locations for
+      the languages and the list of all
+
+    - ``traits.custom['WIKIPEDIA_LANGUAGES']``: not used in the wikipedia engine
+
+    """
+    fetch_wikimedia_traits(engine_traits)
     engine_traits.custom['wiki_netloc'] = {}
+    engine_traits.custom['WIKIPEDIA_LANGUAGES'] = []