SearXNG Developer Documentation
searx.engines.startpage Namespace Reference

Functions

 get_sc_code (searxng_locale, params)
 
 request (query, params)
 
 _request_cat_web (query, params)
 
 response (resp)
 
 _response_cat_web (dom)
 
 fetch_traits (EngineTraits engine_traits)
 

Variables

logging.Logger logger
 
EngineTraits traits
 
dict about
 
str startpage_categ = 'web'
 
bool send_accept_language_header = True
 
list categories = ['general', 'web']
 
bool paging = True
 
int max_page = 18
 
bool time_range_support = True
 
bool safesearch = True
 
dict time_range_dict = {'day': 'd', 'week': 'w', 'month': 'm', 'year': 'y'}
 
dict safesearch_dict = {0: '0', 1: '1', 2: '1'}
 
str base_url = 'https://www.startpage.com'
 
str search_url = base_url + '/sp/search'
 
str search_form_xpath = '//form[@id="search"]'
 
int sc_code_ts = 0
 
str sc_code = ''
 
int sc_code_cache_sec = 30
 

Detailed Description

Startpage's language & region selectors are a mess ..

.. _startpage regions:

Startpage regions
=================

In the list of regions there are tags we need to map to common region tags::

  pt-BR_BR --> pt_BR
  zh-CN_CN --> zh_Hans_CN
  zh-TW_TW --> zh_Hant_TW
  zh-TW_HK --> zh_Hant_HK
  en-GB_GB --> en_GB

and there is at least one tag with a three-letter language tag (ISO 639-2)::

  fil_PH --> fil_PH

The locale code ``no_NO`` from Startpage does not exist and is therefore mapped
to ``nb_NO``, otherwise babel raises::

    babel.core.UnknownLocaleError: unknown locale 'no_NO'

For reference see the language-subtag-registry at IANA; ``no`` is the
macrolanguage [1]_ and the W3C recommends the more specific subtag over the
macrolanguage [2]_.

.. [1] `iana: language-subtag-registry
   <https://www.iana.org/assignments/language-subtag-registry/language-subtag-registry>`_ ::

      type: language
      Subtag: nb
      Description: Norwegian Bokmål
      Added: 2005-10-16
      Suppress-Script: Latn
      Macrolanguage: no

.. [2]
   Use macrolanguages with care.  Some language subtags have a Scope field set to
   macrolanguage, i.e. this primary language subtag encompasses a number of more
   specific primary language subtags in the registry.  ...  As we recommended for
   the collection subtags mentioned above, in most cases you should try to use
   the more specific subtags ... `W3: The primary language subtag
   <https://www.w3.org/International/questions/qa-choosing-language-tags#langsubtag>`_
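
A minimal, standalone sketch of this normalization with babel, along the lines
of what :py:obj:`fetch_traits` does (the helper name is illustrative; the
sketch returns plain babel identifiers)::

    import babel

    def normalize_region(eng_tag: str) -> str:
        """Map a Startpage region tag to a babel locale identifier."""
        eng_tag = {'no_NO': 'nb_NO'}.get(eng_tag, eng_tag)  # babel has no 'no_NO'
        if '-' in eng_tag:
            # e.g. 'zh-TW_HK': language 'zh', territory 'HK'
            lang, rest = eng_tag.split('-')
            territory = rest.split('_')[-1]
            return str(babel.Locale.parse(lang + '_' + territory, sep='_'))
        return str(babel.Locale.parse(eng_tag, sep='_'))

    print(normalize_region('zh-TW_HK'))  # -> 'zh_Hant_HK'
    print(normalize_region('no_NO'))     # -> 'nb_NO'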

.. _startpage languages:

Startpage languages
===================

:py:obj:`send_accept_language_header`:
  The names displayed in Startpage's settings page depend on the location of
  the IP when the ``Accept-Language`` HTTP header is unset.  In
  :py:obj:`fetch_traits` we use::

    'Accept-Language': "en-US,en;q=0.5",
    ..

  to get uniform names, independent of the IP.
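
For a regular search request, :py:obj:`get_sc_code` derives the
``Accept-Language`` header from the SearXNG locale instead.  A minimal,
standalone sketch of that mapping (the helper name is illustrative)::

    import babel

    def accept_language(searxng_locale: str) -> str:
        """Build an Accept-Language value the way get_sc_code() does."""
        if searxng_locale == 'all':
            searxng_locale = 'en-US'
        locale = babel.Locale.parse(searxng_locale, sep='-')
        if locale.territory:
            return "%s-%s,%s;q=0.9,*;q=0.5" % (locale.language, locale.territory, locale.language)
        return locale.language

    print(accept_language('fr-CA'))  # -> "fr-CA,fr;q=0.9,*;q=0.5"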

.. _startpage categories:

Startpage categories
====================

Startpage's category (for Web-search, News, Videos, ..) is set by
:py:obj:`startpage_categ` in settings.yml::

  - name: startpage
    engine: startpage
    startpage_categ: web
    ...

.. hint::

   The default category is ``web``; categories other than ``web`` are not yet
   implemented.

Function Documentation

◆ _request_cat_web()

searx.engines.startpage._request_cat_web(query, params)  [protected]

Definition at line 259 of file startpage.py.

259 def _request_cat_web(query, params):
260
261     engine_region = traits.get_region(params['searxng_locale'], 'en-US')
262     engine_language = traits.get_language(params['searxng_locale'], 'en')
263
264     # build arguments
265     args = {
266         'query': query,
267         'cat': 'web',
268         't': 'device',
269         'sc': get_sc_code(params['searxng_locale'], params),  # hint: this func needs HTTP headers,
270         'with_date': time_range_dict.get(params['time_range'], ''),
271     }
272
273     if engine_language:
274         args['language'] = engine_language
275         args['lui'] = engine_language
276
277     args['abp'] = '1'
278     if params['pageno'] > 1:
279         args['page'] = params['pageno']
280
281     # build cookie
282     lang_homepage = 'en'
283     cookie = OrderedDict()
284     cookie['date_time'] = 'world'
285     cookie['disable_family_filter'] = safesearch_dict[params['safesearch']]
286     cookie['disable_open_in_new_window'] = '0'
287     cookie['enable_post_method'] = '1'  # hint: POST
288     cookie['enable_proxy_safety_suggest'] = '1'
289     cookie['enable_stay_control'] = '1'
290     cookie['instant_answers'] = '1'
291     cookie['lang_homepage'] = 's/device/%s/' % lang_homepage
292     cookie['num_of_results'] = '10'
293     cookie['suggestions'] = '1'
294     cookie['wt_unit'] = 'celsius'
295
296     if engine_language:
297         cookie['language'] = engine_language
298         cookie['language_ui'] = engine_language
299
300     if engine_region:
301         cookie['search_results_region'] = engine_region
302
303     params['cookies']['preferences'] = 'N1N'.join(["%sEEE%s" % x for x in cookie.items()])
304     logger.debug('cookie preferences: %s', params['cookies']['preferences'])
305
306     # POST request
307     logger.debug("data: %s", args)
308     params['data'] = args
309     params['method'] = 'POST'
310     params['url'] = search_url
311     params['headers']['Origin'] = base_url
312     params['headers']['Referer'] = base_url + '/'
313     # is the Accept header needed?
314     # params['headers']['Accept'] = 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8'
315
316     return params
317
318
319 # get response from search-request

References searx.engines.startpage.get_sc_code().

Referenced by searx.engines.startpage.request().

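The ``preferences`` cookie is a single string in which each key/value pair is
encoded as ``keyEEEvalue`` and the pairs are joined with ``N1N`` (file line 303
above).  A standalone sketch of the resulting value (truncated cookie)::

    from collections import OrderedDict

    cookie = OrderedDict()
    cookie['date_time'] = 'world'
    cookie['disable_family_filter'] = '0'
    cookie['enable_post_method'] = '1'

    preferences = 'N1N'.join(["%sEEE%s" % x for x in cookie.items()])
    print(preferences)
    # -> date_timeEEEworldN1Ndisable_family_filterEEE0N1Nenable_post_methodEEE1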

◆ _response_cat_web()

searx.engines.startpage._response_cat_web(dom)  [protected]

Definition at line 330 of file startpage.py.

330 def _response_cat_web(dom):
331     results = []
332
333     # parse results
334     for result in eval_xpath(dom, '//div[@class="w-gl"]/div[contains(@class, "result")]'):
335         links = eval_xpath(result, './/a[contains(@class, "result-title result-link")]')
336         if not links:
337             continue
338         link = links[0]
339         url = link.attrib.get('href')
340
341         # block google-ad url's
342         if re.match(r"^http(s|)://(www\.)?google\.[a-z]+/aclk.*$", url):
343             continue
344
345         # block startpage search url's
346         if re.match(r"^http(s|)://(www\.)?startpage\.com/do/search\?.*$", url):
347             continue
348
349         title = extract_text(eval_xpath(link, 'h2'))
350         content = eval_xpath(result, './/p[contains(@class, "description")]')
351         content = extract_text(content, allow_none=True) or ''
352
353         published_date = None
354
355         # check if search result starts with something like: "2 Sep 2014 ... "
356         if re.match(r"^([1-9]|[1-2][0-9]|3[0-1]) [A-Z][a-z]{2} [0-9]{4} \.\.\. ", content):
357             date_pos = content.find('...') + 4
358             date_string = content[0 : date_pos - 5]
359             # fix content string
360             content = content[date_pos:]
361
362             try:
363                 published_date = dateutil.parser.parse(date_string, dayfirst=True)
364             except ValueError:
365                 pass
366
367         # check if search result starts with something like: "5 days ago ... "
368         elif re.match(r"^[0-9]+ days? ago \.\.\. ", content):
369             date_pos = content.find('...') + 4
370             date_string = content[0 : date_pos - 5]
371
372             # calculate datetime
373             published_date = datetime.now() - timedelta(days=int(re.match(r'\d+', date_string).group()))  # type: ignore
374
375             # fix content string
376             content = content[date_pos:]
377
378         if published_date:
379             # append result
380             results.append({'url': url, 'title': title, 'content': content, 'publishedDate': published_date})
381         else:
382             # append result
383             results.append({'url': url, 'title': title, 'content': content})
384
385     # return results
386     return results
387
388

Referenced by searx.engines.startpage.response().

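Some Startpage descriptions are prefixed with a date, e.g. ``"2 Sep 2014 ... "``
or ``"5 days ago ... "``.  A standalone sketch of how such a prefix is split
off, along the lines of the parsing above::

    import re
    import dateutil.parser

    content = "2 Sep 2014 ... rest of the description"
    if re.match(r"^([1-9]|[1-2][0-9]|3[0-1]) [A-Z][a-z]{2} [0-9]{4} \.\.\. ", content):
        date_pos = content.find('...') + 4
        date_string = content[0 : date_pos - 5]
        content = content[date_pos:]
        published_date = dateutil.parser.parse(date_string, dayfirst=True)
        print(published_date.date(), '|', content)
        # -> 2014-09-02 | rest of the description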

◆ fetch_traits()

searx.engines.startpage.fetch_traits(EngineTraits engine_traits)
Fetch :ref:`languages <startpage languages>` and :ref:`regions <startpage
regions>` from Startpage.

Definition at line 389 of file startpage.py.

389 def fetch_traits(engine_traits: EngineTraits):
390     """Fetch :ref:`languages <startpage languages>` and :ref:`regions <startpage
391     regions>` from Startpage."""
392     # pylint: disable=too-many-branches
393
394     headers = {
395         'User-Agent': gen_useragent(),
396         'Accept-Language': "en-US,en;q=0.5",  # Startpage's settings page should show English names (see above)
397     }
398     resp = get('https://www.startpage.com/do/settings', headers=headers)
399
400     if not resp.ok:  # type: ignore
401         print("ERROR: response from Startpage is not OK.")
402
403     dom = lxml.html.fromstring(resp.text)  # type: ignore
404
405     # regions
406
407     sp_region_names = []
408     for option in dom.xpath('//form[@name="settings"]//select[@name="search_results_region"]/option'):
409         sp_region_names.append(option.get('value'))
410
411     for eng_tag in sp_region_names:
412         if eng_tag == 'all':
413             continue
414         babel_region_tag = {'no_NO': 'nb_NO'}.get(eng_tag, eng_tag)  # norway
415
416         if '-' in babel_region_tag:
417             l, r = babel_region_tag.split('-')
418             r = r.split('_')[-1]
419             sxng_tag = region_tag(babel.Locale.parse(l + '_' + r, sep='_'))
420
421         else:
422             try:
423                 sxng_tag = region_tag(babel.Locale.parse(babel_region_tag, sep='_'))
424
425             except babel.UnknownLocaleError:
426                 print("ERROR: can't determine babel locale of startpage's locale %s" % eng_tag)
427                 continue
428
429         conflict = engine_traits.regions.get(sxng_tag)
430         if conflict:
431             if conflict != eng_tag:
432                 print("CONFLICT: babel %s --> %s, %s" % (sxng_tag, conflict, eng_tag))
433             continue
434         engine_traits.regions[sxng_tag] = eng_tag
435
436     # languages
437
438     catalog_engine2code = {name.lower(): lang_code for lang_code, name in babel.Locale('en').languages.items()}
439
440     # get the native name of every language known by babel
441
442     for lang_code in filter(
443         lambda lang_code: lang_code.find('_') == -1, babel.localedata.locale_identifiers()  # type: ignore
444     ):
445         native_name = babel.Locale(lang_code).get_language_name().lower()  # type: ignore
446         # add native name exactly as it is
447         catalog_engine2code[native_name] = lang_code
448
449         # add "normalized" language name (i.e. français becomes francais and español becomes espanol)
450         unaccented_name = ''.join(filter(lambda c: not combining(c), normalize('NFKD', native_name)))
451         if len(unaccented_name) == len(unaccented_name.encode()):
452             # add only if result is ascii (otherwise "normalization" didn't work)
453             catalog_engine2code[unaccented_name] = lang_code
454
455     # values that can't be determined by babel's languages names
456
457     catalog_engine2code.update(
458         {
459             # traditional chinese used in ..
460             'fantizhengwen': 'zh_Hant',
461             # Korean alphabet
462             'hangul': 'ko',
463             # Malayalam is one of 22 scheduled languages of India.
464             'malayam': 'ml',
465             'norsk': 'nb',
466             'sinhalese': 'si',
467         }
468     )
469
470     skip_eng_tags = {
471         'english_uk',  # SearXNG lang 'en' already maps to 'english'
472     }
473
474     for option in dom.xpath('//form[@name="settings"]//select[@name="language"]/option'):
475
476         eng_tag = option.get('value')
477         if eng_tag in skip_eng_tags:
478             continue
479         name = extract_text(option).lower()  # type: ignore
480
481         sxng_tag = catalog_engine2code.get(eng_tag)
482         if sxng_tag is None:
483             sxng_tag = catalog_engine2code[name]
484
485         conflict = engine_traits.languages.get(sxng_tag)
486         if conflict:
487             if conflict != eng_tag:
488                 print("CONFLICT: babel %s --> %s, %s" % (sxng_tag, conflict, eng_tag))
489             continue
490         engine_traits.languages[sxng_tag] = eng_tag
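
The "normalized" language names above (file lines 449-453) are built by
stripping combining marks from the NFKD decomposition of the native name.  A
standalone sketch::

    from unicodedata import combining, normalize

    def unaccent(name: str) -> str:
        """'français' -> 'francais', 'español' -> 'espanol'"""
        return ''.join(c for c in normalize('NFKD', name) if not combining(c))

    print(unaccent('français'), unaccent('español'))  # -> francais espanol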

◆ get_sc_code()

searx.engines.startpage.get_sc_code(searxng_locale, params)
Get an actual ``sc`` argument from Startpage's search form (HTML page).

Startpage puts a ``sc`` argument on every HTML :py:obj:`search form
<search_form_xpath>`.  Without this argument Startpage considers the request
to be from a bot.  We do not know what is encoded in the value of the ``sc``
argument, but it seems to be a kind of *time-stamp*.

Startpage's search form generates a new sc-code on each request.  This
function scrapes a new sc-code from Startpage's home page every
:py:obj:`sc_code_cache_sec` seconds.

Definition at line 167 of file startpage.py.

167 def get_sc_code(searxng_locale, params):
168     """Get an actual ``sc`` argument from Startpage's search form (HTML page).
169
170     Startpage puts a ``sc`` argument on every HTML :py:obj:`search form
171     <search_form_xpath>`.  Without this argument Startpage considers the request
172     to be from a bot.  We do not know what is encoded in the value of the ``sc``
173     argument, but it seems to be a kind of *time-stamp*.
174
175     Startpage's search form generates a new sc-code on each request.  This
176     function scrapes a new sc-code from Startpage's home page every
177     :py:obj:`sc_code_cache_sec` seconds.
178
179     """
180
181     global sc_code_ts, sc_code  # pylint: disable=global-statement
182
183     if sc_code and (time() < (sc_code_ts + sc_code_cache_sec)):
184         logger.debug("get_sc_code: reuse '%s'", sc_code)
185         return sc_code
186
187     headers = {**params['headers']}
188     headers['Origin'] = base_url
189     headers['Referer'] = base_url + '/'
190     # headers['Connection'] = 'keep-alive'
191     # headers['Accept-Encoding'] = 'gzip, deflate, br'
192     # headers['Accept'] = 'text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,*/*;q=0.8'
193     # headers['User-Agent'] = 'Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:105.0) Gecko/20100101 Firefox/105.0'
194
195     # add Accept-Language header
196     if searxng_locale == 'all':
197         searxng_locale = 'en-US'
198     locale = babel.Locale.parse(searxng_locale, sep='-')
199
200     if send_accept_language_header:
201         ac_lang = locale.language
202         if locale.territory:
203             ac_lang = "%s-%s,%s;q=0.9,*;q=0.5" % (
204                 locale.language,
205                 locale.territory,
206                 locale.language,
207             )
208         headers['Accept-Language'] = ac_lang
209
210     get_sc_url = base_url + '/?sc=%s' % (sc_code)
211     logger.debug("query new sc time-stamp ... %s", get_sc_url)
212     logger.debug("headers: %s", headers)
213     resp = get(get_sc_url, headers=headers)
214
215     # ?? x = network.get('https://www.startpage.com/sp/cdn/images/filter-chevron.svg', headers=headers)
216     # ?? https://www.startpage.com/sp/cdn/images/filter-chevron.svg
217     # ?? ping-back URL: https://www.startpage.com/sp/pb?sc=TLsB0oITjZ8F21
218
219     if str(resp.url).startswith('https://www.startpage.com/sp/captcha'):  # type: ignore
220         raise SearxEngineCaptchaException(
221             message="get_sc_code: got redirected to https://www.startpage.com/sp/captcha",
222         )
223
224     dom = lxml.html.fromstring(resp.text)  # type: ignore
225
226     try:
227         sc_code = eval_xpath(dom, search_form_xpath + '//input[@name="sc"]/@value')[0]
228     except IndexError as exc:
229         logger.debug("suspend startpage API --> https://github.com/searxng/searxng/pull/695")
230         raise SearxEngineCaptchaException(
231             message="get_sc_code: [PR-695] query new sc time-stamp failed! (%s)" % resp.url,  # type: ignore
232         ) from exc
233
234     sc_code_ts = time()
235     logger.debug("get_sc_code: new value is: %s", sc_code)
236     return sc_code
237
238

Referenced by searx.engines.startpage._request_cat_web().

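The ``sc`` value is cached at module level (:py:obj:`sc_code`,
:py:obj:`sc_code_ts`) and refreshed at most every :py:obj:`sc_code_cache_sec`
seconds.  A standalone sketch of this caching pattern (not the engine code
itself)::

    from time import time

    CACHE_SEC = 30
    _value, _value_ts = '', 0.0

    def cached_sc(fetch):
        """Return the cached value; call ``fetch()`` only when it expired."""
        global _value, _value_ts
        if _value and time() < _value_ts + CACHE_SEC:
            return _value
        _value = fetch()
        _value_ts = time()
        return _value

    print(cached_sc(lambda: 'fresh-sc-code'))  # first call fetches
    print(cached_sc(lambda: 'ignored'))        # within 30 seconds the cached value is reused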

◆ request()

searx.engines.startpage.request(query, params)
Assemble a Startpage request.

To avoid a CAPTCHA we need to send a well formed HTTP POST request with a
cookie.  We need to form a request that is identical to the request built by
Startpage's search form:

- in the cookie the **region** is selected
- in the HTTP POST data the **language** is selected

Additionally, the arguments from Startpage's search form need to be set in the
HTTP POST data / compare the ``<input>`` elements: :py:obj:`search_form_xpath`.
Definition at line 239 of file startpage.py.

239 def request(query, params):
240     """Assemble a Startpage request.
241
242     To avoid a CAPTCHA we need to send a well formed HTTP POST request with a
243     cookie.  We need to form a request that is identical to the request built by
244     Startpage's search form:
245
246     - in the cookie the **region** is selected
247     - in the HTTP POST data the **language** is selected
248
249     Additionally, the arguments from Startpage's search form need to be set in the
250     HTTP POST data / compare the ``<input>`` elements: :py:obj:`search_form_xpath`.
251     """
252     if startpage_categ == 'web':
253         return _request_cat_web(query, params)
254
255     logger.error("Startpage's category '%s' is not yet implemented.", startpage_categ)
256     return params
257
258

References searx.engines.startpage._request_cat_web().


◆ response()

searx.engines.startpage.response(resp)

Definition at line 320 of file startpage.py.

320 def response(resp):
321     dom = lxml.html.fromstring(resp.text)
322
323     if startpage_categ == 'web':
324         return _response_cat_web(dom)
325
326     logger.error("Startpage's category '%s' is not yet implemented.", startpage_categ)
327     return []
328
329

References searx.engines.startpage._response_cat_web().


Variable Documentation

◆ about

dict searx.engines.startpage.about
Initial value:
{
    "website": 'https://startpage.com',
    "wikidata_id": 'Q2333295',
    "official_api_documentation": None,
    "use_official_api": False,
    "require_api_key": False,
    "results": 'HTML',
}

Definition at line 107 of file startpage.py.

◆ base_url

str searx.engines.startpage.base_url = 'https://www.startpage.com'

Definition at line 139 of file startpage.py.

◆ categories

list searx.engines.startpage.categories = ['general', 'web']

Definition at line 127 of file startpage.py.

◆ logger

logging.Logger searx.engines.startpage.logger

Definition at line 102 of file startpage.py.

◆ max_page

int searx.engines.startpage.max_page = 18

Definition at line 129 of file startpage.py.

◆ paging

bool searx.engines.startpage.paging = True

Definition at line 128 of file startpage.py.

◆ safesearch

bool searx.engines.startpage.safesearch = True

Definition at line 133 of file startpage.py.

◆ safesearch_dict

dict searx.engines.startpage.safesearch_dict = {0: '0', 1: '1', 2: '1'}

Definition at line 136 of file startpage.py.

◆ sc_code

str searx.engines.startpage.sc_code = ''

Definition at line 162 of file startpage.py.

◆ sc_code_cache_sec

int searx.engines.startpage.sc_code_cache_sec = 30

Definition at line 163 of file startpage.py.

◆ sc_code_ts

int searx.engines.startpage.sc_code_ts = 0

Definition at line 161 of file startpage.py.

◆ search_form_xpath

str searx.engines.startpage.search_form_xpath = '//form[@id="search"]'

Definition at line 145 of file startpage.py.

◆ search_url

str searx.engines.startpage.search_url = base_url + '/sp/search'

Definition at line 140 of file startpage.py.

◆ send_accept_language_header

bool searx.engines.startpage.send_accept_language_header = True

Definition at line 120 of file startpage.py.

◆ startpage_categ

str searx.engines.startpage.startpage_categ = 'web'

Definition at line 116 of file startpage.py.

◆ time_range_dict

dict searx.engines.startpage.time_range_dict = {'day': 'd', 'week': 'w', 'month': 'm', 'year': 'y'}

Definition at line 135 of file startpage.py.

◆ time_range_support

bool searx.engines.startpage.time_range_support = True

Definition at line 132 of file startpage.py.

◆ traits

EngineTraits searx.engines.startpage.traits

Definition at line 104 of file startpage.py.