.oO SearXNG Developer Documentation Oo.
searx.engines.bing Namespace Reference

Functions

 _page_offset (pageno)
 set_bing_cookies (params, engine_language, engine_region)
 request (query, params)
 response (resp)
 fetch_traits (EngineTraits engine_traits)

Variables

dict about
list categories = ['general', 'web']
bool paging = True
int max_page = 200
bool time_range_support = True
bool safesearch = True
str base_url = 'https://www.bing.com/search'

Detailed Description

This is the implementation of the Bing-WEB engine. Parts of this
implementation are shared by other engines:

- :ref:`bing images engine`
- :ref:`bing news engine`
- :ref:`bing videos engine`

On the `preference page`_ Bing offers a lot of languages and regions (see the
sections LANGUAGE and COUNTRY/REGION).  The language is the language of the UI;
SearXNG needs it to get translations of data such as *"published last week"*.

There is a description of the official search-APIs_; unfortunately this is not
the API we can use, nor the API Bing itself uses.  You can look up some things
in that API to get a better picture of Bing, but value specifications such as
the market codes are usually outdated or at least no longer used by Bing itself.

The market codes have been harmonized and are identical for web, video and
images.  The news area has also been harmonized with the other categories.  Only
political adjustments still seem to be made -- for example, there is no news
category for the Chinese market.

.. _preference page: https://www.bing.com/account/general
.. _search-APIs: https://learn.microsoft.com/en-us/bing/search-apis/

Function Documentation

◆ _page_offset()

searx.engines.bing._page_offset ( pageno)
protected

Definition at line 67 of file bing.py.

67def _page_offset(pageno):
68 return (int(pageno) - 1) * 10 + 1
69
70

Referenced by request(), and response().

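As a quick worked example (relying on the one-line definition above), the 1-based result offsets produced for the first pages are:

    assert _page_offset(1) == 1    # page 1 starts at result 1
    assert _page_offset(2) == 11   # page 2 starts at result 11
    assert _page_offset(3) == 21   # page 3 starts at result 21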

◆ fetch_traits()

searx.engines.bing.fetch_traits ( EngineTraits engine_traits)
Fetch languages and regions from Bing-Web.

Definition at line 189 of file bing.py.

189def fetch_traits(engine_traits: EngineTraits):
190 """Fetch languages and regions from Bing-Web."""
191 # pylint: disable=import-outside-toplevel
192
193 from searx.network import get # see https://github.com/searxng/searxng/issues/762
194 from searx.utils import gen_useragent
195
196 headers = {
197 "User-Agent": gen_useragent(),
198 "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8",
199 "Accept-Language": "en-US;q=0.5,en;q=0.3",
200 "Accept-Encoding": "gzip, deflate, br",
201 "DNT": "1",
202 "Connection": "keep-alive",
203 "Upgrade-Insecure-Requests": "1",
204 "Sec-GPC": "1",
205 "Cache-Control": "max-age=0",
206 }
207
208 resp = get("https://www.bing.com/account/general", headers=headers)
209 if not resp.ok: # type: ignore
210 print("ERROR: response from bing is not OK.")
211
212 dom = html.fromstring(resp.text) # type: ignore
213
214 # languages
215
216 engine_traits.languages['zh'] = 'zh-hans'
217
218 map_lang = {'prs': 'fa-AF', 'en': 'en-us'}
219 bing_ui_lang_map = {
220 # HINT: this list probably needs to be supplemented
221 'en': 'us', # en --> en-us
222 'da': 'dk', # da --> da-dk
223 }
224
225 for href in eval_xpath(dom, '//div[@id="language-section-content"]//div[@class="languageItem"]/a/@href'):
226 eng_lang = parse_qs(urlparse(href).query)['setlang'][0]
227 babel_lang = map_lang.get(eng_lang, eng_lang)
228 try:
229 sxng_tag = language_tag(babel.Locale.parse(babel_lang.replace('-', '_')))
230 except babel.UnknownLocaleError:
231 print("ERROR: language (%s) is unknown by babel" % (babel_lang))
232 continue
233 # Language (e.g. 'en' or 'de') from https://www.bing.com/account/general
234 # is converted by bing to 'en-us' or 'de-de'. But only if there is not
234 # already a '-' delimiter in the language. For instance 'pt-PT' -->
236 # 'pt-pt' and 'pt-br' --> 'pt-br'
237 bing_ui_lang = eng_lang.lower()
238 if '-' not in bing_ui_lang:
239 bing_ui_lang = bing_ui_lang + '-' + bing_ui_lang_map.get(bing_ui_lang, bing_ui_lang)
240
241 conflict = engine_traits.languages.get(sxng_tag)
242 if conflict:
243 if conflict != bing_ui_lang:
244 print(f"CONFLICT: babel {sxng_tag} --> {conflict}, {bing_ui_lang}")
245 continue
246 engine_traits.languages[sxng_tag] = bing_ui_lang
247
248 # regions (aka "market codes")
249
250 engine_traits.regions['zh-CN'] = 'zh-cn'
251
252 map_market_codes = {
253 'zh-hk': 'en-hk', # not sure why, but at M$ this is the market code for Hongkong
254 }
255 for href in eval_xpath(dom, '//div[@id="region-section-content"]//div[@class="regionItem"]/a/@href'):
256 cc_tag = parse_qs(urlparse(href).query)['cc'][0]
257 if cc_tag == 'clear':
258 engine_traits.all_locale = cc_tag
259 continue
260
261 # add market codes from official languages of the country ..
262 for lang_tag in babel.languages.get_official_languages(cc_tag, de_facto=True):
263 if lang_tag not in engine_traits.languages.keys():
264 # print("ignore lang: %s <-- %s" % (cc_tag, lang_tag))
265 continue
266 lang_tag = lang_tag.split('_')[0] # zh_Hant --> zh
267 market_code = f"{lang_tag}-{cc_tag}" # zh-tw
268
269 market_code = map_market_codes.get(market_code, market_code)
270 sxng_tag = region_tag(babel.Locale.parse('%s_%s' % (lang_tag, cc_tag.upper())))
271 conflict = engine_traits.regions.get(sxng_tag)
272 if conflict:
273 if conflict != market_code:
274 print("CONFLICT: babel %s --> %s, %s" % (sxng_tag, conflict, market_code))
275 continue
276 engine_traits.regions[sxng_tag] = market_code
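A minimal, self-contained sketch of the UI-language normalization done in the loop above; the map entries and special cases are taken from the listing, while the helper name normalize_ui_lang is introduced here only for illustration:

    bing_ui_lang_map = {'en': 'us', 'da': 'dk'}

    def normalize_ui_lang(eng_lang):
        # same steps as in fetch_traits() above
        bing_ui_lang = eng_lang.lower()
        if '-' not in bing_ui_lang:
            bing_ui_lang = bing_ui_lang + '-' + bing_ui_lang_map.get(bing_ui_lang, bing_ui_lang)
        return bing_ui_lang

    assert normalize_ui_lang('en') == 'en-us'     # special-cased: en --> en-us
    assert normalize_ui_lang('da') == 'da-dk'     # special-cased: da --> da-dk
    assert normalize_ui_lang('de') == 'de-de'     # default: language doubles as region
    assert normalize_ui_lang('pt-PT') == 'pt-pt'  # already has a '-', only lowercased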

◆ request()

searx.engines.bing.request ( query, params )
Assemble a Bing-Web request.

Definition at line 77 of file bing.py.

77def request(query, params):
78 """Assemble a Bing-Web request."""
79
80 engine_region = traits.get_region(params['searxng_locale'], traits.all_locale) # type: ignore
81 engine_language = traits.get_language(params['searxng_locale'], 'en') # type: ignore
82 set_bing_cookies(params, engine_language, engine_region)
83
84 page = params.get('pageno', 1)
85 query_params = {
86 'q': query,
87 # if arg 'pq' is missed, sometimes on page 4 we get results from page 1,
88 # don't ask why it is only sometimes / its M$ and they have never been
89 # deterministic ;)
90 'pq': query,
91 }
92
93 # To get the correct page, the arg 'first' and the arg FORM are needed; the value PERE
94 # is used on page 2, on page 3 it's PERE1 and on page 4 it's PERE2 .. and so forth.
95 # The 'first' arg should never be sent on page 1.
96
97 if page > 1:
98 query_params['first'] = _page_offset(page) # see also arg FORM
99 if page == 2:
100 query_params['FORM'] = 'PERE'
101 elif page > 2:
102 query_params['FORM'] = 'PERE%s' % (page - 2)
103
104 params['url'] = f'{base_url}?{urlencode(query_params)}'
105
106 if params.get('time_range'):
107 unix_day = int(time.time() / 86400)
108 time_ranges = {'day': '1', 'week': '2', 'month': '3', 'year': f'5_{unix_day-365}_{unix_day}'}
109 params['url'] += f'&filters=ex1:"ez{time_ranges[params["time_range"]]}"'
110
111 return params
112
113

References _page_offset(), and set_bing_cookies().

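For illustration, this is roughly the URL the function assembles for page 3 of a hypothetical query "searxng"; cookies for region and language are handled separately by set_bing_cookies(), and base_url and _page_offset() are re-declared here so the sketch runs standalone:

    from urllib.parse import urlencode

    base_url = 'https://www.bing.com/search'

    def _page_offset(pageno):
        return (int(pageno) - 1) * 10 + 1

    page = 3
    query_params = {'q': 'searxng', 'pq': 'searxng'}
    if page > 1:
        query_params['first'] = _page_offset(page)                             # 21
        query_params['FORM'] = 'PERE' if page == 2 else 'PERE%s' % (page - 2)  # 'PERE1'

    print('%s?%s' % (base_url, urlencode(query_params)))
    # https://www.bing.com/search?q=searxng&pq=searxng&first=21&FORM=PERE1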

◆ response()

searx.engines.bing.response ( resp)

Definition at line 114 of file bing.py.

114def response(resp):
115 # pylint: disable=too-many-locals
116
117 results = []
118 result_len = 0
119
120 dom = html.fromstring(resp.text)
121
122 # parse results again if nothing is found yet
123
124 for result in eval_xpath_list(dom, '//ol[@id="b_results"]/li[contains(@class, "b_algo")]'):
125
126 link = eval_xpath_getindex(result, './/h2/a', 0, None)
127 if link is None:
128 continue
129 url = link.attrib.get('href')
130 title = extract_text(link)
131
132 content = eval_xpath(result, './/p')
133 for p in content:
134 # Make sure that the element is free of:
135 # <span class="algoSlug_icon" # data-priority="2">Web</span>
136 for e in p.xpath('.//span[@class="algoSlug_icon"]'):
137 e.getparent().remove(e)
138 content = extract_text(content)
139
140 # get the real URL
141 if url.startswith('https://www.bing.com/ck/a?'):
142 # get the first value of u parameter
143 url_query = urlparse(url).query
144 parsed_url_query = parse_qs(url_query)
145 param_u = parsed_url_query["u"][0]
146 # remove "a1" in front
147 encoded_url = param_u[2:]
148 # add padding
149 encoded_url = encoded_url + '=' * (-len(encoded_url) % 4)
150 # decode base64 encoded URL
151 url = base64.urlsafe_b64decode(encoded_url).decode()
152
153 # append result
154 results.append({'url': url, 'title': title, 'content': content})
155
156 # get number_of_results
157 if results:
158 result_len_container = "".join(eval_xpath(dom, '//span[@class="sb_count"]//text()'))
159 if "-" in result_len_container:
160 start_str, result_len_container = re.split(r'-\d+', result_len_container)
161 start = int(start_str)
162 else:
163 start = 1
164
165 result_len_container = re.sub('[^0-9]', '', result_len_container)
166 if len(result_len_container) > 0:
167 result_len = int(result_len_container)
168
169 expected_start = _page_offset(resp.search_params.get("pageno", 1))
170
171 if expected_start != start:
172 if expected_start > result_len:
173 # Avoid reading more results than available.
174 # For example, if there is 100 results from some search and we try to get results from 120 to 130,
175 # Bing will send back the results from 0 to 10 and no error.
176 # If we compare results count with the first parameter of the request we can avoid this "invalid"
177 # results.
178 return []
179
180 # Sometimes Bing will send back the first result page instead of the requested page as a rate limiting
181 # measure.
182 msg = f"Expected results to start at {expected_start}, but got results starting at {start}"
183 raise SearxEngineAPIException(msg)
184
185 results.append({'number_of_results': result_len})
186 return results
187
188

References _page_offset().
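The redirect handling can be reproduced in isolation. The link below is constructed only for illustration (real /ck/a? links carry additional parameters), but the decoding steps are the same as in the listing: take the first 'u' value, drop the 'a1' prefix, restore the base64 padding and decode:

    import base64
    from urllib.parse import urlparse, parse_qs

    target = 'https://example.org/some/page'
    # build a Bing-style redirect link (illustrative only)
    param_u = 'a1' + base64.urlsafe_b64encode(target.encode()).decode().rstrip('=')
    redirect = 'https://www.bing.com/ck/a?u=' + param_u

    encoded_url = parse_qs(urlparse(redirect).query)['u'][0][2:]  # drop leading "a1"
    encoded_url += '=' * (-len(encoded_url) % 4)                  # re-add padding
    assert base64.urlsafe_b64decode(encoded_url).decode() == target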

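The result-count parsing can likewise be walked through with a hypothetical sb_count string of the shape the code expects:

    import re

    result_len_container = '1-10 of 2,400,000 results'  # hypothetical counter text
    if '-' in result_len_container:
        start_str, result_len_container = re.split(r'-\d+', result_len_container)
        start = int(start_str)                           # 1
    else:
        start = 1
    result_len = int(re.sub('[^0-9]', '', result_len_container))  # 2400000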

◆ set_bing_cookies()

searx.engines.bing.set_bing_cookies ( params, engine_language, engine_region )

Definition at line 71 of file bing.py.

71def set_bing_cookies(params, engine_language, engine_region):
72 params['cookies']['_EDGE_CD'] = f'm={engine_region}&u={engine_language}'
73 params['cookies']['_EDGE_S'] = f'mkt={engine_region}&ui={engine_language}'
74 logger.debug("bing cookies: %s", params['cookies'])
75
76

Referenced by request().
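Worked through for a hypothetical region 'de-de' with UI language 'de', the two f-strings above yield:

    params = {'cookies': {}}
    engine_region, engine_language = 'de-de', 'de'
    params['cookies']['_EDGE_CD'] = f'm={engine_region}&u={engine_language}'
    params['cookies']['_EDGE_S'] = f'mkt={engine_region}&ui={engine_language}'
    # {'_EDGE_CD': 'm=de-de&u=de', '_EDGE_S': 'mkt=de-de&ui=de'}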


Variable Documentation

◆ about

dict searx.engines.bing.about
Initial value:
1= {
2 "website": 'https://www.bing.com',
3 "wikidata_id": 'Q182496',
4 "official_api_documentation": 'https://www.microsoft.com/en-us/bing/apis/bing-web-search-api',
5 "use_official_api": False,
6 "require_api_key": False,
7 "results": 'HTML',
8}

Definition at line 42 of file bing.py.

◆ base_url

str searx.engines.bing.base_url = 'https://www.bing.com/search'

Definition at line 63 of file bing.py.

◆ categories

list searx.engines.bing.categories = ['general', 'web']

Definition at line 52 of file bing.py.

◆ max_page

int searx.engines.bing.max_page = 200

Definition at line 54 of file bing.py.

◆ paging

bool searx.engines.bing.paging = True

Definition at line 53 of file bing.py.

◆ safesearch

bool searx.engines.bing.safesearch = True

Definition at line 58 of file bing.py.

◆ time_range_support

bool searx.engines.bing.time_range_support = True

Definition at line 57 of file bing.py.