.oO SearXNG Developer Documentation Oo.
searx.search.processors.abstract.EngineProcessor Class Reference

Public Member Functions

 __init__ (self, engine, str engine_name)
 
 initialize (self)
 
 has_initialize_function (self)
 
 handle_exception (self, result_container, exception_or_message, suspend=False)
 
 extend_container (self, result_container, start_time, search_results)
 
 extend_container_if_suspended (self, result_container)
 
 get_params (self, search_query, engine_category)
 
 search (self, query, params, result_container, start_time, timeout_limit)
 
 get_tests (self)
 
 get_default_tests (self)
 

Public Attributes

 engine = engine
 
 engine_name = engine_name
 
 logger = engines[engine_name].logger
 
 suspended_status = SUSPENDED_STATUS.setdefault(key, SuspendedStatus())
 

Protected Member Functions

 _extend_container_basic (self, result_container, start_time, search_results)
 

Static Private Attributes

str __slots__ = 'engine', 'engine_name', 'lock', 'suspended_status', 'logger'
 

Detailed Description

Base class used for all types of request processors.

Definition at line 58 of file abstract.py.

Constructor & Destructor Documentation

◆ __init__()

searx.search.processors.abstract.EngineProcessor.__init__(self, engine, str engine_name)

Definition at line 63 of file abstract.py.

63 def __init__(self, engine, engine_name: str):
64 self.engine = engine
65 self.engine_name = engine_name
66 self.logger = engines[engine_name].logger
67 key = get_network(self.engine_name)
68 key = id(key) if key else self.engine_name
69 self.suspended_status = SUSPENDED_STATUS.setdefault(key, SuspendedStatus())
70
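
The constructor keys the shared suspension state by the engine's network: engines that run over the same network object end up sharing one SuspendedStatus instance. A minimal, self-contained sketch of that setdefault() pattern (stand-in names, not part of abstract.py):

class SuspendedStatusSketch:                     # stand-in for searx's SuspendedStatus
    def __init__(self):
        self.is_suspended = False

SUSPENDED_STATUS_SKETCH = {}                     # stand-in for the module-level SUSPENDED_STATUS

def status_for(network_key):
    # setdefault() returns the already-stored instance for a known key, so every
    # processor using the same key gets the same status object
    return SUSPENDED_STATUS_SKETCH.setdefault(network_key, SuspendedStatusSketch())

assert status_for('net-1') is status_for('net-1')        # same network -> shared status
assert status_for('net-1') is not status_for('net-2')    # different network -> own status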

Member Function Documentation

◆ _extend_container_basic()

searx.search.processors.abstract.EngineProcessor._extend_container_basic(self, result_container, start_time, search_results)  [protected]

Definition at line 108 of file abstract.py.

108 def _extend_container_basic(self, result_container, start_time, search_results):
109 # update result_container
110 result_container.extend(self.engine_name, search_results)
111 engine_time = default_timer() - start_time
112 page_load_time = get_time_for_thread()
113 result_container.add_timing(self.engine_name, engine_time, page_load_time)
114 # metrics
115 counter_inc('engine', self.engine_name, 'search', 'count', 'successful')
116 histogram_observe(engine_time, 'engine', self.engine_name, 'time', 'total')
117 if page_load_time is not None:
118 histogram_observe(page_load_time, 'engine', self.engine_name, 'time', 'http')
119

References searx.search.processors.abstract.EngineProcessor.engine_name, searx.search.processors.offline.OfflineProcessor.engine_name, and searx.search.processors.online.OnlineProcessor.engine_name.

Referenced by searx.search.processors.abstract.EngineProcessor.extend_container().


◆ extend_container()

searx.search.processors.abstract.EngineProcessor.extend_container(self, result_container, start_time, search_results)

Definition at line 120 of file abstract.py.

120 def extend_container(self, result_container, start_time, search_results):
121 if getattr(threading.current_thread(), '_timeout', False):
122 # the main thread is not waiting anymore
123 self.handle_exception(result_container, 'timeout', None)
124 else:
125 # check if the engine accepted the request
126 if search_results is not None:
127 self._extend_container_basic(result_container, start_time, search_results)
128 self.suspended_status.resume()
129

References searx.search.processors.abstract.EngineProcessor._extend_container_basic(), searx.search.processors.abstract.EngineProcessor.handle_exception(), and searx.search.processors.abstract.EngineProcessor.suspended_status.

Referenced by searx.search.processors.offline.OfflineProcessor.search(), and searx.search.processors.online.OnlineProcessor.search().
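
The ``_timeout`` check relies on the main thread flagging the engine's worker thread once it stops waiting for it. A self-contained sketch of that pattern (illustrative names only, not part of abstract.py):

import threading
import time

def engine_worker(results):
    time.sleep(0.2)                                        # simulate a slow engine request
    if getattr(threading.current_thread(), '_timeout', False):
        results.append(('unresponsive', 'timeout'))        # like handle_exception(..., 'timeout')
    else:
        results.append(('ok', []))                         # like _extend_container_basic(...)

results = []
thread = threading.Thread(target=engine_worker, args=(results,))
thread.start()
thread.join(timeout=0.1)                                   # the main thread stops waiting here
if thread.is_alive():
    thread._timeout = True                                 # flag read later by the worker
thread.join()
print(results)                                             # -> [('unresponsive', 'timeout')]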


◆ extend_container_if_suspended()

searx.search.processors.abstract.EngineProcessor.extend_container_if_suspended(self, result_container)

Definition at line 130 of file abstract.py.

130 def extend_container_if_suspended(self, result_container):
131 if self.suspended_status.is_suspended:
132 result_container.add_unresponsive_engine(
133 self.engine_name, self.suspended_status.suspend_reason, suspended=True
134 )
135 return True
136 return False
137

References searx.search.processors.abstract.EngineProcessor.engine_name, searx.search.processors.offline.OfflineProcessor.engine_name, searx.search.processors.online.OnlineProcessor.engine_name, and searx.search.processors.abstract.EngineProcessor.suspended_status.
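
A hedged caller-side sketch (the surrounding orchestration code and variable names are assumed, not taken from abstract.py): a suspended engine is recorded as unresponsive and skipped before any request is dispatched.

if not processor.extend_container_if_suspended(result_container):
    # engine is not suspended: dispatch the request as usual
    processor.search(query, params, result_container, start_time, timeout_limit)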

◆ get_default_tests()

searx.search.processors.abstract.EngineProcessor.get_default_tests(self)

◆ get_params()

searx.search.processors.abstract.EngineProcessor.get_params(self, search_query, engine_category)

Returns a dictionary of request params (see :ref:`request params <engine request arguments>`) or ``None`` if the request is not supported.

Conditions under which the request is not supported (``None`` is returned):

- A page number > 1 when the engine does not support paging.
- A time range when the engine does not support time ranges.

Reimplemented in searx.search.processors.online.OnlineProcessor, searx.search.processors.online_currency.OnlineCurrencyProcessor, searx.search.processors.online_dictionary.OnlineDictionaryProcessor, and searx.search.processors.online_url_search.OnlineUrlSearchProcessor.

Definition at line 138 of file abstract.py.

138 def get_params(self, search_query, engine_category):
139 """Returns a set of (see :ref:`request params <engine request arguments>`) or
140 ``None`` if request is not supported.
141
142 Not supported conditions (``None`` is returned):
143
144 - A page-number > 1 when engine does not support paging.
145 - A time range when the engine does not support time range.
146 """
147 # if paging is not supported, skip
148 if search_query.pageno > 1 and not self.engine.paging:
149 return None
150
151 # if max page is reached, skip
152 max_page = self.engine.max_page or settings['search']['max_page']
153 if max_page and max_page < search_query.pageno:
154 return None
155
156 # if time_range is not supported, skip
157 if search_query.time_range and not self.engine.time_range_support:
158 return None
159
160 params = {}
161 params['category'] = engine_category
162 params['pageno'] = search_query.pageno
163 params['safesearch'] = search_query.safesearch
164 params['time_range'] = search_query.time_range
165 params['engine_data'] = search_query.engine_data.get(self.engine_name, {})
166 params['searxng_locale'] = search_query.lang
167
168 # deprecated / vintage --> use params['searxng_locale']
169 #
170 # Conditions related to engine's traits are implemented in engine.traits
171 # module. Don't do 'locale' decisions here in the abstract layer of the
172 # search processor, just pass the value from user's choice unchanged to
173 # the engine request.
174
175 if hasattr(self.engine, 'language') and self.engine.language:
176 params['language'] = self.engine.language
177 else:
178 params['language'] = search_query.lang
179
180 return params
181

References searx.search.processors.abstract.EngineProcessor.engine, searx.search.processors.online.OnlineProcessor.engine, searx.search.processors.online_dictionary.OnlineDictionaryProcessor.engine, searx.search.processors.abstract.EngineProcessor.engine_name, searx.search.processors.offline.OfflineProcessor.engine_name, and searx.search.processors.online.OnlineProcessor.engine_name.
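
A hedged caller-side sketch (caller names assumed): a request is only dispatched when get_params() returns a dict; ``None`` means the engine is skipped for this query.

request_params = processor.get_params(search_query, 'general')
if request_params is not None:
    # keys guaranteed by this base implementation: category, pageno, safesearch,
    # time_range, engine_data, searxng_locale, language
    processor.search(search_query.query, request_params,
                     result_container, start_time, timeout_limit)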

◆ get_tests()

searx.search.processors.abstract.EngineProcessor.get_tests(self)

Definition at line 186 of file abstract.py.

186 def get_tests(self):
187 tests = getattr(self.engine, 'tests', None)
188 if tests is None:
189 tests = getattr(self.engine, 'additional_tests', {})
190 tests.update(self.get_default_tests())
191 return tests
192
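
As the listing shows, an engine module may set ``tests`` to replace the default checker tests entirely, or ``additional_tests`` to have the defaults merged in. A hypothetical engine-module snippet (the test schema shown is an assumption, not taken from abstract.py):

# hypothetical engine module
additional_tests = {
    'unicode_query': {
        'matrix': {'query': 'Ōkami'},          # assumed checker schema
        'result_container': ['not_empty'],
    },
}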

◆ handle_exception()

searx.search.processors.abstract.EngineProcessor.handle_exception(self, result_container, exception_or_message, suspend=False)

Definition at line 85 of file abstract.py.

85 def handle_exception(self, result_container, exception_or_message, suspend=False):
86 # update result_container
87 if isinstance(exception_or_message, BaseException):
88 exception_class = exception_or_message.__class__
89 module_name = getattr(exception_class, '__module__', 'builtins')
90 module_name = '' if module_name == 'builtins' else module_name + '.'
91 error_message = module_name + exception_class.__qualname__
92 else:
93 error_message = exception_or_message
94 result_container.add_unresponsive_engine(self.engine_name, error_message)
95 # metrics
96 counter_inc('engine', self.engine_name, 'search', 'count', 'error')
97 if isinstance(exception_or_message, BaseException):
98 count_exception(self.engine_name, exception_or_message)
99 else:
100 count_error(self.engine_name, exception_or_message)
101 # suspend the engine ?
102 if suspend:
103 suspended_time = None
104 if isinstance(exception_or_message, SearxEngineAccessDeniedException):
105 suspended_time = exception_or_message.suspended_time
106 self.suspended_status.suspend(suspended_time, error_message) # pylint: disable=no-member
107

References searx.search.processors.abstract.EngineProcessor.engine_name, searx.search.processors.offline.OfflineProcessor.engine_name, searx.search.processors.online.OnlineProcessor.engine_name, and searx.search.processors.abstract.EngineProcessor.suspended_status.

Referenced by searx.search.processors.abstract.EngineProcessor.extend_container(), and searx.search.processors.online.OnlineProcessor.search().
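
A hedged sketch of the calling pattern inside a concrete processor's search() (control flow assumed; ``do_engine_request`` is a hypothetical helper): failures are routed here, and access-denied style errors additionally suspend the engine.

try:
    search_results = do_engine_request(query, params)              # hypothetical helper
except SearxEngineAccessDeniedException as exc:
    self.handle_exception(result_container, exc, suspend=True)     # also suspends the engine
except Exception as exc:                                           # pylint: disable=broad-except
    self.handle_exception(result_container, exc)
else:
    self.extend_container(result_container, start_time, search_results)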


◆ has_initialize_function()

searx.search.processors.abstract.EngineProcessor.has_initialize_function(self)

Definition at line 82 of file abstract.py.

82 def has_initialize_function(self):
83 return hasattr(self.engine, 'init')
84

References searx.search.processors.abstract.EngineProcessor.engine, searx.search.processors.online.OnlineProcessor.engine, and searx.search.processors.online_dictionary.OnlineDictionaryProcessor.engine.

◆ initialize()

searx.search.processors.abstract.EngineProcessor.initialize(self)

Reimplemented in searx.search.processors.online.OnlineProcessor.

Definition at line 71 of file abstract.py.

71 def initialize(self):
72 try:
73 self.engine.init(get_engine_from_settings(self.engine_name))
74 except SearxEngineResponseException as exc:
75 self.logger.warning('Fail to initialize // %s', exc)
76 except Exception: # pylint: disable=broad-except
77 self.logger.exception('Fail to initialize')
78 else:
79 self.logger.debug('Initialized')
80

References searx.search.processors.abstract.EngineProcessor.engine, searx.search.processors.online.OnlineProcessor.engine, searx.search.processors.online_dictionary.OnlineDictionaryProcessor.engine, searx.search.processors.abstract.EngineProcessor.engine_name, searx.search.processors.offline.OfflineProcessor.engine_name, searx.search.processors.online.OnlineProcessor.engine_name, and searx.search.processors.abstract.EngineProcessor.logger.
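
initialize() calls the optional ``init`` hook of the engine module with that engine's settings (see has_initialize_function()). A minimal, hypothetical engine-module sketch of such a hook (the ``base_url`` setting is an assumption):

# hypothetical engine module
def init(engine_settings):
    # validate settings or warm up resources once at startup;
    # an exception raised here is logged by EngineProcessor.initialize()
    if not engine_settings.get('base_url'):
        raise ValueError('base_url is required')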

◆ search()

searx.search.processors.abstract.EngineProcessor.search(self, query, params, result_container, start_time, timeout_limit)

Reimplemented in searx.search.processors.offline.OfflineProcessor, and searx.search.processors.online.OnlineProcessor.

Definition at line 183 of file abstract.py.

183 def search(self, query, params, result_container, start_time, timeout_limit):
184 pass
185
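
search() is a no-op here and is reimplemented by the concrete processors. A hedged sketch of the general shape of such an override (hypothetical subclass; the ``self.engine.search`` call is an assumption modeled on the offline processor):

class ExampleProcessor(EngineProcessor):
    def search(self, query, params, result_container, start_time, timeout_limit):
        try:
            search_results = self.engine.search(query, params)     # assumed engine API
        except Exception as exc:                                   # pylint: disable=broad-except
            self.handle_exception(result_container, exc)
        else:
            self.extend_container(result_container, start_time, search_results)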

Member Data Documentation

◆ __slots__

str searx.search.processors.abstract.EngineProcessor.__slots__ = 'engine', 'engine_name', 'lock', 'suspended_status', 'logger'  [static, private]

Definition at line 61 of file abstract.py.


The documentation for this class was generated from the following file: searx/search/processors/abstract.py