.oO SearXNG Developer Documentation Oo.
searx.search.processors.abstract.EngineProcessor Class Reference

Public Member Functions

 __init__ (self, "Engine|ModuleType" engine, str engine_name)
 initialize (self)
 has_initialize_function (self)
 handle_exception (self, result_container, exception_or_message, suspend=False)
 extend_container (self, result_container, start_time, search_results)
 extend_container_if_suspended (self, result_container)
dict[str, t.Any] get_params (self, search_query, engine_category)
 search (self, query, params, result_container, start_time, timeout_limit)
 get_tests (self)
 get_default_tests (self)

Public Attributes

 engine = engine
str engine_name = engine_name
logging.Logger logger = engines[engine_name].logger
SuspendedStatus suspended_status = SUSPENDED_STATUS.setdefault(key, SuspendedStatus())

Protected Member Functions

 _extend_container_basic (self, result_container, start_time, search_results)

Static Private Attributes

tuple __slots__ = 'engine', 'engine_name', 'suspended_status', 'logger'

Detailed Description

Base class used for all types of request processors.

Definition at line 63 of file abstract.py.

Constructor & Destructor Documentation

◆ __init__()

searx.search.processors.abstract.EngineProcessor.__init__ ( self,
"Engine|ModuleType" engine,
str engine_name )

Definition at line 68 of file abstract.py.

68 def __init__(self, engine: "Engine|ModuleType", engine_name: str):
69     self.engine: "Engine" = engine
70     self.engine_name: str = engine_name
71     self.logger: logging.Logger = engines[engine_name].logger
72     key = get_network(self.engine_name)
73     key = id(key) if key else self.engine_name
74     self.suspended_status: SuspendedStatus = SUSPENDED_STATUS.setdefault(key, SuspendedStatus())
75

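The key chosen at source lines 72-73 is what lets engines configured on the same HTTP network share a single SuspendedStatus object: when the engine has a network, its id() is used as the dictionary key; otherwise the engine name is. A standalone sketch of that selection, with a hypothetical `get_network` stand-in (the real lookup lives in searx.network):

```python
class SuspendedStatus:
    # minimal stand-in for searx's SuspendedStatus
    def __init__(self):
        self.is_suspended = False

SUSPENDED_STATUS: dict = {}

_POOL = object()  # assumed: one shared network object for two engines
_ON_POOL = {"img_engine_1", "img_engine_2"}

def get_network(engine_name):
    # hypothetical stand-in: engines on the pool get the same network object,
    # everything else has no dedicated network
    return _POOL if engine_name in _ON_POOL else None

def status_for(engine_name):
    key = get_network(engine_name)
    # engines without a dedicated network fall back to their own name
    key = id(key) if key else engine_name
    return SUSPENDED_STATUS.setdefault(key, SuspendedStatus())
```

With this mapping, suspending one engine on the shared network suspends its siblings, while an engine without a network keeps an independent status.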
Member Function Documentation

◆ _extend_container_basic()

searx.search.processors.abstract.EngineProcessor._extend_container_basic ( self,
result_container,
start_time,
search_results )
protected

Definition at line 113 of file abstract.py.

113 def _extend_container_basic(self, result_container, start_time, search_results):
114     # update result_container
115     result_container.extend(self.engine_name, search_results)
116     engine_time = default_timer() - start_time
117     page_load_time = get_time_for_thread()
118     result_container.add_timing(self.engine_name, engine_time, page_load_time)
119     # metrics
120     counter_inc('engine', self.engine_name, 'search', 'count', 'successful')
121     histogram_observe(engine_time, 'engine', self.engine_name, 'time', 'total')
122     if page_load_time is not None:
123         histogram_observe(page_load_time, 'engine', self.engine_name, 'time', 'http')
124

References engine_name.

Referenced by extend_container().


◆ extend_container()

searx.search.processors.abstract.EngineProcessor.extend_container ( self,
result_container,
start_time,
search_results )

Definition at line 125 of file abstract.py.

125 def extend_container(self, result_container, start_time, search_results):
126     if getattr(threading.current_thread(), '_timeout', False):
127         # the main thread is not waiting anymore
128         self.handle_exception(result_container, 'timeout', None)
129     else:
130         # check if the engine accepted the request
131         if search_results is not None:
132             self._extend_container_basic(result_container, start_time, search_results)
133         self.suspended_status.resume()
134

References _extend_container_basic(), handle_exception(), and suspended_status.

Referenced by searx.search.processors.offline.OfflineProcessor.search(), and searx.search.processors.online.OnlineProcessor.search().

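extend_container relies on a `_timeout` attribute that the main search thread sets on a worker thread once the overall deadline has passed; results arriving after that are recorded as a timeout instead of being merged. A minimal sketch of that handshake, with a plain list standing in for searx's ResultContainer:

```python
import threading

merged = []

def extend(search_results):
    # simplified extend_container: honor the flag set by the main thread
    if getattr(threading.current_thread(), '_timeout', False):
        merged.append(('unresponsive', 'timeout'))
    elif search_results is not None:
        merged.append(('ok', search_results))

worker = threading.Thread(target=extend, args=(['some result'],))
worker._timeout = True   # the main thread stopped waiting before the merge
worker.start()
worker.join()
```

Because the flag was already set, the worker records a timeout even though it had results to merge.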

◆ extend_container_if_suspended()

searx.search.processors.abstract.EngineProcessor.extend_container_if_suspended ( self,
result_container )

Definition at line 135 of file abstract.py.

135 def extend_container_if_suspended(self, result_container):
136     if self.suspended_status.is_suspended:
137         result_container.add_unresponsive_engine(
138             self.engine_name, self.suspended_status.suspend_reason, suspended=True
139         )
140         return True
141     return False
142

References engine_name, and suspended_status.
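A standalone sketch of the suspended-engine short-circuit: when the shared status is suspended, the engine is reported as unresponsive instead of being queried. A plain list of tuples stands in for ResultContainer.add_unresponsive_engine:

```python
class SuspendedStatus:
    # minimal stand-in for searx's SuspendedStatus
    def __init__(self, suspended=False, reason=None):
        self.is_suspended = suspended
        self.suspend_reason = reason

def extend_container_if_suspended(status, unresponsive, engine_name):
    # report the engine as unresponsive instead of sending a request
    if status.is_suspended:
        unresponsive.append((engine_name, status.suspend_reason, True))
        return True
    return False

report = []
skipped = extend_container_if_suspended(
    SuspendedStatus(True, 'access denied'), report, 'demo')
```

The caller uses the boolean return to skip the request entirely when the engine is still serving a suspension.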

◆ get_default_tests()

searx.search.processors.abstract.EngineProcessor.get_default_tests ( self)

◆ get_params()

dict[str, t.Any] searx.search.processors.abstract.EngineProcessor.get_params ( self,
search_query,
engine_category )
Returns a dictionary of request params (see :ref:`request params <engine request arguments>`) or
``None`` if the request is not supported.

Conditions under which ``None`` is returned:

- A page number > 1 when the engine does not support paging.
- A time range when the engine does not support time ranges.

Reimplemented in searx.search.processors.online.OnlineProcessor, searx.search.processors.online_currency.OnlineCurrencyProcessor, searx.search.processors.online_dictionary.OnlineDictionaryProcessor, and searx.search.processors.online_url_search.OnlineUrlSearchProcessor.

Definition at line 143 of file abstract.py.

143 def get_params(self, search_query, engine_category) -> dict[str, t.Any]:
144     """Returns a set of (see :ref:`request params <engine request arguments>`) or
145     ``None`` if request is not supported.
146
147     Not supported conditions (``None`` is returned):
148
149     - A page-number > 1 when engine does not support paging.
150     - A time range when the engine does not support time range.
151     """
152     # if paging is not supported, skip
153     if search_query.pageno > 1 and not self.engine.paging:
154         return None
155
156     # if max page is reached, skip
157     max_page = self.engine.max_page or settings['search']['max_page']
158     if max_page and max_page < search_query.pageno:
159         return None
160
161     # if time_range is not supported, skip
162     if search_query.time_range and not self.engine.time_range_support:
163         return None
164
165     params = {}
166     params["query"] = search_query.query
167     params['category'] = engine_category
168     params['pageno'] = search_query.pageno
169     params['safesearch'] = search_query.safesearch
170     params['time_range'] = search_query.time_range
171     params['engine_data'] = search_query.engine_data.get(self.engine_name, {})
172     params['searxng_locale'] = search_query.lang
173
174     # deprecated / vintage --> use params['searxng_locale']
175     #
176     # Conditions related to engine's traits are implemented in engine.traits
177     # module. Don't do 'locale' decisions here in the abstract layer of the
178     # search processor, just pass the value from user's choice unchanged to
179     # the engine request.
180
181     if hasattr(self.engine, 'language') and self.engine.language:
182         params['language'] = self.engine.language
183     else:
184         params['language'] = search_query.lang
185
186     return params
187

References searx.result_types._base.LegacyResult.engine, searx.result_types._base.MainResult.engine, searx.result_types._base.Result.engine, engine, and engine_name.
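The three early returns are the gate that decides whether an engine is queried at all. A standalone sketch of that gating logic, with SimpleNamespace stand-ins for the engine and query objects and an assumed constant in place of settings['search']['max_page']:

```python
from types import SimpleNamespace

MAX_PAGE_SETTING = 10  # assumed stand-in for settings['search']['max_page']

def get_params(engine, search_query, engine_category):
    # mirrors the gating logic of EngineProcessor.get_params
    if search_query.pageno > 1 and not engine.paging:
        return None          # paging not supported
    max_page = engine.max_page or MAX_PAGE_SETTING
    if max_page and max_page < search_query.pageno:
        return None          # beyond the page limit
    if search_query.time_range and not engine.time_range_support:
        return None          # time ranges not supported
    return {'query': search_query.query, 'category': engine_category,
            'pageno': search_query.pageno}

engine = SimpleNamespace(paging=False, max_page=0, time_range_support=False)
page2 = SimpleNamespace(query='q', pageno=2, time_range=None)
page1 = SimpleNamespace(query='q', pageno=1, time_range=None)
```

Here `get_params(engine, page2, 'general')` yields None because the engine declares no paging support, while the first page passes through and returns the params dict.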

◆ get_tests()

searx.search.processors.abstract.EngineProcessor.get_tests ( self)

Definition at line 192 of file abstract.py.

192 def get_tests(self):
193     tests = getattr(self.engine, 'tests', None)
194     if tests is None:
195         tests = getattr(self.engine, 'additional_tests', {})
196         tests.update(self.get_default_tests())
197     return tests
198

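The precedence here is easy to misread: an explicit `tests` attribute on the engine replaces the defaults outright, while `additional_tests` is combined with them (and the defaults win on key collisions, since they are applied last). A standalone sketch, copying the dict to avoid mutating the engine module:

```python
from types import SimpleNamespace

def get_tests(engine, default_tests):
    # mirrors EngineProcessor.get_tests: an explicit `tests` attribute wins
    # outright; otherwise the processor defaults are merged over
    # `additional_tests`
    tests = getattr(engine, 'tests', None)
    if tests is None:
        tests = dict(getattr(engine, 'additional_tests', {}))
        tests.update(default_tests)
    return tests

with_extra = SimpleNamespace(additional_tests={'rosebud': {}})
with_tests = SimpleNamespace(tests={'only': {}})
```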
◆ handle_exception()

searx.search.processors.abstract.EngineProcessor.handle_exception ( self,
result_container,
exception_or_message,
suspend = False )

Definition at line 90 of file abstract.py.

 90 def handle_exception(self, result_container, exception_or_message, suspend=False):
 91     # update result_container
 92     if isinstance(exception_or_message, BaseException):
 93         exception_class = exception_or_message.__class__
 94         module_name = getattr(exception_class, '__module__', 'builtins')
 95         module_name = '' if module_name == 'builtins' else module_name + '.'
 96         error_message = module_name + exception_class.__qualname__
 97     else:
 98         error_message = exception_or_message
 99     result_container.add_unresponsive_engine(self.engine_name, error_message)
100     # metrics
101     counter_inc('engine', self.engine_name, 'search', 'count', 'error')
102     if isinstance(exception_or_message, BaseException):
103         count_exception(self.engine_name, exception_or_message)
104     else:
105         count_error(self.engine_name, exception_or_message)
106     # suspend the engine ?
107     if suspend:
108         suspended_time = None
109         if isinstance(exception_or_message, SearxEngineAccessDeniedException):
110             suspended_time = exception_or_message.suspended_time
111         self.suspended_status.suspend(suspended_time, error_message)  # pylint: disable=no-member
112

References engine_name, and suspended_status.

Referenced by extend_container(), and searx.search.processors.online.OnlineProcessor.search().


◆ has_initialize_function()

searx.search.processors.abstract.EngineProcessor.has_initialize_function ( self)

Definition at line 87 of file abstract.py.

87 def has_initialize_function(self):
88     return hasattr(self.engine, 'init')
89

References searx.result_types._base.LegacyResult.engine, searx.result_types._base.MainResult.engine, searx.result_types._base.Result.engine, and engine.

◆ initialize()

searx.search.processors.abstract.EngineProcessor.initialize ( self)

Reimplemented in searx.search.processors.online.OnlineProcessor, and searx.search.processors.online_currency.OnlineCurrencyProcessor.

Definition at line 76 of file abstract.py.

76 def initialize(self):
77     try:
78         self.engine.init(get_engine_from_settings(self.engine_name))
79     except SearxEngineResponseException as exc:
80         self.logger.warning('Fail to initialize // %s', exc)
81     except Exception:  # pylint: disable=broad-except
82         self.logger.exception('Fail to initialize')
83     else:
84         self.logger.debug('Initialized')
85

References searx.result_types._base.LegacyResult.engine, searx.result_types._base.MainResult.engine, searx.result_types._base.Result.engine, engine, engine_name, searx.enginelib.Engine.logger, and logger.

◆ search()

searx.search.processors.abstract.EngineProcessor.search ( self,
query,
params,
result_container,
start_time,
timeout_limit )

Reimplemented in searx.search.processors.offline.OfflineProcessor, and searx.search.processors.online.OnlineProcessor.

Definition at line 189 of file abstract.py.

189 def search(self, query, params, result_container, start_time, timeout_limit):
190     pass
191

Member Data Documentation

◆ __slots__

tuple searx.search.processors.abstract.EngineProcessor.__slots__ = 'engine', 'engine_name', 'suspended_status', 'logger'
staticprivate

Definition at line 66 of file abstract.py.
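Declaring `__slots__` fixes the attribute set of every EngineProcessor instance (and drops the per-instance `__dict__`), so a misspelled attribute assignment fails loudly instead of silently creating a new attribute. A minimal illustration of the same pattern:

```python
class Slotted:
    # like EngineProcessor, restrict instances to a fixed attribute set
    __slots__ = ('engine', 'engine_name')

    def __init__(self, engine_name):
        self.engine_name = engine_name

proc = Slotted('demo')
try:
    proc.enginename = 'typo'   # misspelled, not in __slots__
    raised = False
except AttributeError:
    raised = True
```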

◆ engine

"Engine" searx.search.processors.abstract.EngineProcessor.engine = engine

Definition at line 69 of file abstract.py.

◆ engine_name

str searx.search.processors.abstract.EngineProcessor.engine_name = engine_name

Definition at line 70 of file abstract.py.

◆ logger

logging.Logger searx.search.processors.abstract.EngineProcessor.logger = engines[engine_name].logger

Definition at line 71 of file abstract.py.

◆ suspended_status

SuspendedStatus searx.search.processors.abstract.EngineProcessor.suspended_status = SUSPENDED_STATUS.setdefault(key, SuspendedStatus())

Definition at line 74 of file abstract.py.

Referenced by extend_container(), extend_container_if_suspended(), and handle_exception().


The documentation for this class was generated from the following file:
  • /home/andrew/Documents/code/public/searxng/searx/search/processors/abstract.py