.oO SearXNG Developer Documentation Oo.
searx.engines.bing Namespace Reference

Functions

 _page_offset (pageno)
 set_bing_cookies (params, engine_language, engine_region)
 request (query, params)
 response (resp)
 fetch_traits (EngineTraits engine_traits)

Variables

 logger = logging.getLogger()
dict about
list categories = ['general', 'web']
bool paging = True
int max_page = 200
bool time_range_support = True
bool safesearch = True
str base_url = 'https://www.bing.com/search'

Detailed Description

This is the implementation of the Bing-WEB engine. Parts of this
implementation are shared by other engines:

- :ref:`bing images engine`
- :ref:`bing news engine`
- :ref:`bing videos engine`

On the `preference page`_ Bing offers a lot of languages and regions (see the
LANGUAGE and COUNTRY/REGION sections).  The language is the language of the UI;
SearXNG needs it to get the translations of data such as *"published last week"*.

There is a description of the official search-APIs_; unfortunately, this is not
the API we can use, nor the one Bing itself uses.  You can look up some things
in that API to get a better picture of Bing, but value specifications such as
the market codes are usually outdated or no longer used by Bing itself.

The market codes have been harmonized and are identical for web, video and
images.  The news area has also been harmonized with the other categories.  Only
political adjustments still seem to be made -- for example, there is no news
category for the Chinese market.

.. _preference page: https://www.bing.com/account/general
.. _search-APIs: https://learn.microsoft.com/en-us/bing/search-apis/

Function Documentation

◆ _page_offset()

searx.engines.bing._page_offset(pageno)
protected

Definition at line 75 of file bing.py.

75 def _page_offset(pageno):
76     return (int(pageno) - 1) * 10 + 1
77
78

Referenced by request(), and response().

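The offset is 1-based and advances in steps of ten results per page. A short usage sketch (assuming the function is imported from searx.engines.bing):

>>> from searx.engines.bing import _page_offset
>>> [_page_offset(pageno) for pageno in (1, 2, 3, 4)]
[1, 11, 21, 31]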

◆ fetch_traits()

searx.engines.bing.fetch_traits(EngineTraits engine_traits)
Fetch languages and regions from Bing-Web.

Definition at line 197 of file bing.py.

197 def fetch_traits(engine_traits: EngineTraits):
198     """Fetch languages and regions from Bing-Web."""
199     # pylint: disable=import-outside-toplevel
200
201     from searx.network import get  # see https://github.com/searxng/searxng/issues/762
202     from searx.utils import gen_useragent
203
204     headers = {
205         "User-Agent": gen_useragent(),
206         "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8",
207         "Accept-Language": "en-US;q=0.5,en;q=0.3",
208         "Accept-Encoding": "gzip, deflate, br",
209         "DNT": "1",
210         "Connection": "keep-alive",
211         "Upgrade-Insecure-Requests": "1",
212         "Sec-GPC": "1",
213         "Cache-Control": "max-age=0",
214     }
215
216     resp = get("https://www.bing.com/account/general", headers=headers)
217     if not resp.ok:  # type: ignore
218         print("ERROR: response from bing is not OK.")
219
220     dom = html.fromstring(resp.text)  # type: ignore
221
222     # languages
223
224     engine_traits.languages['zh'] = 'zh-hans'
225
226     map_lang = {'prs': 'fa-AF', 'en': 'en-us'}
227     bing_ui_lang_map = {
228         # HINT: this list probably needs to be supplemented
229         'en': 'us',  # en --> en-us
230         'da': 'dk',  # da --> da-dk
231     }
232
233     for href in eval_xpath(dom, '//div[@id="language-section-content"]//div[@class="languageItem"]/a/@href'):
234         eng_lang = parse_qs(urlparse(href).query)['setlang'][0]
235         babel_lang = map_lang.get(eng_lang, eng_lang)
236         try:
237             sxng_tag = language_tag(babel.Locale.parse(babel_lang.replace('-', '_')))
238         except babel.UnknownLocaleError:
239             print("ERROR: language (%s) is unknown by babel" % (babel_lang))
240             continue
241         # Language (e.g. 'en' or 'de') from https://www.bing.com/account/general
242         # is converted by bing to 'en-us' or 'de-de'.  But only if there is not
243         # already a '-' delimiter in the language.  For instance 'pt-PT' -->
244         # 'pt-pt' and 'pt-br' --> 'pt-br'
245         bing_ui_lang = eng_lang.lower()
246         if '-' not in bing_ui_lang:
247             bing_ui_lang = bing_ui_lang + '-' + bing_ui_lang_map.get(bing_ui_lang, bing_ui_lang)
248
249         conflict = engine_traits.languages.get(sxng_tag)
250         if conflict:
251             if conflict != bing_ui_lang:
252                 print(f"CONFLICT: babel {sxng_tag} --> {conflict}, {bing_ui_lang}")
253             continue
254         engine_traits.languages[sxng_tag] = bing_ui_lang
255
256     # regions (aka "market codes")
257
258     engine_traits.regions['zh-CN'] = 'zh-cn'
259
260     map_market_codes = {
261         'zh-hk': 'en-hk',  # not sure why, but at M$ this is the market code for Hong Kong
262     }
263     for href in eval_xpath(dom, '//div[@id="region-section-content"]//div[@class="regionItem"]/a/@href'):
264         cc_tag = parse_qs(urlparse(href).query)['cc'][0]
265         if cc_tag == 'clear':
266             engine_traits.all_locale = cc_tag
267             continue
268
269         # add market codes from official languages of the country ..
270         for lang_tag in babel.languages.get_official_languages(cc_tag, de_facto=True):
271             if lang_tag not in engine_traits.languages.keys():
272                 # print("ignore lang: %s <-- %s" % (cc_tag, lang_tag))
273                 continue
274             lang_tag = lang_tag.split('_')[0]  # zh_Hant --> zh
275             market_code = f"{lang_tag}-{cc_tag}"  # zh-tw
276
277             market_code = map_market_codes.get(market_code, market_code)
278             sxng_tag = region_tag(babel.Locale.parse('%s_%s' % (lang_tag, cc_tag.upper())))
279             conflict = engine_traits.regions.get(sxng_tag)
280             if conflict:
281                 if conflict != market_code:
282                     print("CONFLICT: babel %s --> %s, %s" % (sxng_tag, conflict, market_code))
283                 continue
284             engine_traits.regions[sxng_tag] = market_code
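
The normalization of Bing's setlang codes described in the comments above can be illustrated in isolation. The helper below is a hypothetical sketch (not part of bing.py) that applies the same rule: a plain language code is expanded to a <lang>-<region> UI language unless it already contains a '-':

bing_ui_lang_map = {'en': 'us', 'da': 'dk'}  # same exceptions as in fetch_traits()

def to_bing_ui_lang(setlang):
    # hypothetical helper: mimics the setlang normalization done inside fetch_traits()
    ui_lang = setlang.lower()
    if '-' not in ui_lang:
        # 'de' --> 'de-de'; 'en' --> 'en-us' (via the exception map)
        ui_lang = ui_lang + '-' + bing_ui_lang_map.get(ui_lang, ui_lang)
    return ui_lang

assert to_bing_ui_lang('de') == 'de-de'
assert to_bing_ui_lang('en') == 'en-us'
assert to_bing_ui_lang('pt-PT') == 'pt-pt'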

◆ request()

searx.engines.bing.request(query, params)
Assemble a Bing-Web request.

Definition at line 85 of file bing.py.

85 def request(query, params):
86     """Assemble a Bing-Web request."""
87
88     engine_region = traits.get_region(params['searxng_locale'], traits.all_locale)  # type: ignore
89     engine_language = traits.get_language(params['searxng_locale'], 'en')  # type: ignore
90     set_bing_cookies(params, engine_language, engine_region)
91
92     page = params.get('pageno', 1)
93     query_params = {
94         'q': query,
95         # if the arg 'pq' is missing, sometimes on page 4 we get results from page 1,
96         # don't ask why it is only sometimes / it's M$ and they have never been
97         # deterministic ;)
98         'pq': query,
99     }
100
101     # To get the correct page, the arg 'first' and the arg FORM are needed; the value PERE
102     # is on page 2, on page 3 it's PERE1 and on page 4 it's PERE2 .. and so forth.
103     # The 'first' arg should never be sent on page 1.
104
105     if page > 1:
106         query_params['first'] = _page_offset(page)  # see also arg FORM
107         if page == 2:
108             query_params['FORM'] = 'PERE'
109         elif page > 2:
110             query_params['FORM'] = 'PERE%s' % (page - 2)
111
112     params['url'] = f'{base_url}?{urlencode(query_params)}'
113
114     if params.get('time_range'):
115         unix_day = int(time.time() / 86400)
116         time_ranges = {'day': '1', 'week': '2', 'month': '3', 'year': f'5_{unix_day-365}_{unix_day}'}
117         params['url'] += f'&filters=ex1:"ez{time_ranges[params["time_range"]]}"'
118
119     return params
120
121

References _page_offset(), and set_bing_cookies().
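
The interplay between the 'first' offset and the FORM value is easier to see in isolation. Below is a minimal sketch of only the paging part of request() (a hypothetical helper; real requests also need the cookies and time-range filters set above):

from urllib.parse import urlencode

def bing_paging_args(query, page):
    # mirrors the paging rules of request(): no 'first'/'FORM' on page 1,
    # FORM=PERE on page 2, PERE1 on page 3, PERE2 on page 4, and so forth
    args = {'q': query, 'pq': query}
    if page > 1:
        args['first'] = (page - 1) * 10 + 1  # same as _page_offset(page)
        args['FORM'] = 'PERE' if page == 2 else 'PERE%s' % (page - 2)
    return args

print('https://www.bing.com/search?' + urlencode(bing_paging_args('searxng', 3)))
# https://www.bing.com/search?q=searxng&pq=searxng&first=21&FORM=PERE1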


◆ response()

searx.engines.bing.response(resp)

Definition at line 122 of file bing.py.

122 def response(resp):
123     # pylint: disable=too-many-locals
124
125     results = []
126     result_len = 0
127
128     dom = html.fromstring(resp.text)
129
130     # parse results again if nothing is found yet
131
132     for result in eval_xpath_list(dom, '//ol[@id="b_results"]/li[contains(@class, "b_algo")]'):
133
134         link = eval_xpath_getindex(result, './/h2/a', 0, None)
135         if link is None:
136             continue
137         url = link.attrib.get('href')
138         title = extract_text(link)
139
140         content = eval_xpath(result, './/p')
141         for p in content:
142             # Make sure that the element is free of:
143             #   <span class="algoSlug_icon" # data-priority="2">Web</span>
144             for e in p.xpath('.//span[@class="algoSlug_icon"]'):
145                 e.getparent().remove(e)
146         content = extract_text(content)
147
148         # get the real URL
149         if url.startswith('https://www.bing.com/ck/a?'):
150             # get the first value of the u parameter
151             url_query = urlparse(url).query
152             parsed_url_query = parse_qs(url_query)
153             param_u = parsed_url_query["u"][0]
154             # remove the leading "a1"
155             encoded_url = param_u[2:]
156             # add padding
157             encoded_url = encoded_url + '=' * (-len(encoded_url) % 4)
158             # decode the base64 encoded URL
159             url = base64.urlsafe_b64decode(encoded_url).decode()
160
161         # append result
162         results.append({'url': url, 'title': title, 'content': content})
163
164     # get number_of_results
165     if results:
166         result_len_container = "".join(eval_xpath(dom, '//span[@class="sb_count"]//text()'))
167         if "-" in result_len_container:
168             start_str, result_len_container = re.split(r'-\d+', result_len_container)
169             start = int(start_str)
170         else:
171             start = 1
172
173         result_len_container = re.sub('[^0-9]', '', result_len_container)
174         if len(result_len_container) > 0:
175             result_len = int(result_len_container)
176
177         expected_start = _page_offset(resp.search_params.get("pageno", 1))
178
179         if expected_start != start:
180             if expected_start > result_len:
181                 # Avoid reading more results than available.
182                 # For example, if there are 100 results from some search and we try to get results from 120 to 130,
183                 # Bing will send back the results from 0 to 10 and no error.
184                 # If we compare the result count with the first parameter of the request we can avoid these
185                 # "invalid" results.
186                 return []
187
188             # Sometimes Bing will send back the first result page instead of the requested page as a rate limiting
189             # measure.
190             msg = f"Expected results to start at {expected_start}, but got results starting at {start}"
191             raise SearxEngineAPIException(msg)
192
193     results.append({'number_of_results': result_len})
194     return results
195
196

References _page_offset().
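
The redirect handling in the middle of response() can be exercised on its own. The sketch below uses a hypothetical helper and a synthetically built redirect URL (purely for illustration) to show how the real target URL is recovered from a https://www.bing.com/ck/a? link:

import base64
from urllib.parse import parse_qs, urlparse

def decode_bing_redirect(url):
    # same steps as in response(): take the first 'u' value, strip the leading
    # "a1", restore the base64 padding, then urlsafe-decode
    param_u = parse_qs(urlparse(url).query)['u'][0]
    encoded_url = param_u[2:]
    encoded_url = encoded_url + '=' * (-len(encoded_url) % 4)
    return base64.urlsafe_b64decode(encoded_url).decode()

# build a synthetic redirect URL just for this example
target = 'https://example.org/some/page'
param_u = 'a1' + base64.urlsafe_b64encode(target.encode()).decode().rstrip('=')
assert decode_bing_redirect('https://www.bing.com/ck/a?u=' + param_u) == target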


◆ set_bing_cookies()

searx.engines.bing.set_bing_cookies(params, engine_language, engine_region)

Definition at line 79 of file bing.py.

79 def set_bing_cookies(params, engine_language, engine_region):
80     params['cookies']['_EDGE_CD'] = f'm={engine_region}&u={engine_language}'
81     params['cookies']['_EDGE_S'] = f'mkt={engine_region}&ui={engine_language}'
82     logger.debug("bing cookies: %s", params['cookies'])
83
84

Referenced by request().
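
For example, with a German market and UI language (the values below are illustrative; in practice they come from traits.get_region() and traits.get_language() in request()), the cookies end up as:

params = {'cookies': {}}
set_bing_cookies(params, engine_language='de-de', engine_region='de-de')
# params['cookies'] now contains:
#   {'_EDGE_CD': 'm=de-de&u=de-de', '_EDGE_S': 'mkt=de-de&ui=de-de'}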


Variable Documentation

◆ about

dict searx.engines.bing.about
Initial value:
= {
    "website": 'https://www.bing.com',
    "wikidata_id": 'Q182496',
    "official_api_documentation": 'https://www.microsoft.com/en-us/bing/apis/bing-web-search-api',
    "use_official_api": False,
    "require_api_key": False,
    "results": 'HTML',
}

Definition at line 50 of file bing.py.

◆ base_url

str searx.engines.bing.base_url = 'https://www.bing.com/search'

Definition at line 71 of file bing.py.

◆ categories

list searx.engines.bing.categories = ['general', 'web']

Definition at line 60 of file bing.py.

◆ logger

searx.engines.bing.logger = logging.getLogger()

Definition at line 46 of file bing.py.

◆ max_page

int searx.engines.bing.max_page = 200

Definition at line 62 of file bing.py.

◆ paging

bool searx.engines.bing.paging = True

Definition at line 61 of file bing.py.

◆ safesearch

bool searx.engines.bing.safesearch = True

Definition at line 66 of file bing.py.

◆ time_range_support

bool searx.engines.bing.time_range_support = True

Definition at line 65 of file bing.py.