Recently I tried integrating es (Elasticsearch) into my website to implement its search feature. This is my first time working with Elastic, so if anything here is wrong, feel free to point it out. Without further ado, here is my general approach.
The SDK version I used is Version=8.0.0.0; the official documentation is at Installation | Elasticsearch .NET Client [8.0] | Elastic.
I originally planned to deploy es to Ubuntu alongside my application, but when I got to installing Kibana I ran into all sorts of problems, so I had no choice but to install it on Windows next to my SQL Server, which is asking a lot of a 2 GB server.
1. Configure es
Inside es, I turned on password authentication. Here is my configuration
"Search": { "IsEnable": "true", "Uri": "http://127.0.0.1:9200/", "User": "123", "Password": "123" }
Then add a new assembly
Then, inside ElasticSearchClient, write a constructor that configures the es client:
using System;
using Elastic.Clients.Elasticsearch;
using Elastic.Transport;
using Microsoft.Extensions.Configuration;

namespace Blog.SearchEngine   // namespace placeholder, use your own
{
    public class ElasticSearchClient : IElasticSearchClient
    {
        private readonly ElasticsearchClient elasticsearchClient;

        public ElasticSearchClient(IConfiguration configuration)
        {
            // Read the "Search" section shown above
            string uri = configuration.GetSection("Search:Uri").Value;
            string username = configuration.GetSection("Search:User").Value;
            string password = configuration.GetSection("Search:Password").Value;

            var settings = new ElasticsearchClientSettings(new Uri(uri))
                .Authentication(new BasicAuthentication(username, password))
                .DisableDirectStreaming();

            elasticsearchClient = new ElasticsearchClient(settings);
        }

        public ElasticsearchClient GetClient()
        {
            return elasticsearchClient;
        }
    }
}
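The IElasticSearchClient interface is just a thin wrapper so the client can be registered in DI; a minimal sketch:

using Elastic.Clients.Elasticsearch;

public interface IElasticSearchClient
{
    // Exposes the shared ElasticsearchClient instance
    ElasticsearchClient GetClient();
}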
Then, looking at the SDK's official docs, there is this tip:
The client application should create a single instance of the client and reuse it for the lifetime of the application. Internally, the client manages and maintains HTTP connections to nodes, reusing them to optimize performance. If you are using dependency injection, the client instance should be registered with a singleton lifetime.
So I'll just give it an AddSingleton.
using Microsoft.Extensions.DependencyInjection;

namespace Blog.SearchEngine
{
    public static class ConfigureSearchEngine
    {
        public static void AddSearchEngine(this IServiceCollection services)
        {
            // Register the wrapper as a singleton, per the SDK's recommendation
            services.AddSingleton<IElasticSearchClient, ElasticSearchClient>();
        }
    }
}
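Then call it once at startup; for example, with the minimal hosting model (assuming the usual Program.cs setup):

// Program.cs
var builder = WebApplication.CreateBuilder(args);
builder.Services.AddSearchEngine();   // registers IElasticSearchClient as a singleton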
2. Submit articles and synchronize them to es
Next it's time to synchronize articles to es. My approach is to write to the database first, then publish to RabbitMQ, and have the consumer write to es via the event bus (see my earlier post, "Implementing email push based on EventBus").
First define an es model
using System;

namespace Blog.SearchEngine
{
    // Custom attribute, not part of the SDK; binds the model to an index name
    [ElasticsearchIndex(IndexName = "t_article")]
    public class Article_ES
    {
        public long Id { get; set; }

        /// <summary>
        /// Author
        /// </summary>
        public string Author { get; set; }

        /// <summary>
        /// Title
        /// </summary>
        public string Title { get; set; }

        /// <summary>
        /// Tags
        /// </summary>
        public string Tag { get; set; }

        /// <summary>
        /// Summary
        /// </summary>
        public string Description { get; set; }

        /// <summary>
        /// Content
        /// </summary>
        public string ArticleContent { get; set; }

        /// <summary>
        /// Category
        /// </summary>
        public long ArticleCategoryId { get; set; }

        /// <summary>
        /// Whether the article is original
        /// </summary>
        public bool? IsOriginal { get; set; }

        /// <summary>
        /// Comment count
        /// </summary>
        public int? CommentCount { get; set; }

        /// <summary>
        /// Like count
        /// </summary>
        public int? PraiseCount { get; set; }

        /// <summary>
        /// View count
        /// </summary>
        public int? BrowserCount { get; set; }

        /// <summary>
        /// Favorite count
        /// </summary>
        public int? CollectCount { get; set; }

        /// <summary>
        /// Creation time
        /// </summary>
        public DateTime CreateTime { get; set; }
    }
}
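The ElasticsearchIndex attribute is my own and not part of the SDK; roughly it looks like the sketch below, together with a small helper (the helper name here is just a placeholder) that the later snippets use to resolve the index name from a type:

using System;
using System.Reflection;

// Binds a document type to an index name
[AttributeUsage(AttributeTargets.Class)]
public class ElasticsearchIndexAttribute : Attribute
{
    public string IndexName { get; set; }
}

// Resolves the index name declared on a document type
public static class ElasticIndexHelper
{
    public static string GetIndexName(Type type)
    {
        var attribute = type.GetCustomAttribute<ElasticsearchIndexAttribute>();
        return attribute?.IndexName ?? type.Name.ToLowerInvariant();
    }
}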
Then create the index
// Resolve the index name from the custom attribute, then create the index with explicit mappings
string index = ElasticIndexHelper.GetIndexName(typeof(Article_ES));
await _elasticSearchClient.GetClient().Indices.CreateAsync<Article_ES>(index, s => s
    .Mappings(x => x
        .Properties(t => t
            .LongNumber(l => l.Id)
            .Text(l => l.Title, z => z.Analyzer("ik_max_word"))
            .Keyword(l => l.Author)
            .Text(l => l.Tag, z => z.Analyzer("ik_max_word"))
            .Text(l => l.Description, z => z.Analyzer("ik_max_word"))
            .Text(l => l.ArticleContent, z => z.Analyzer("ik_max_word"))
            .LongNumber(l => l.ArticleCategoryId)
            .Boolean(l => l.IsOriginal)
            .IntegerNumber(l => l.CommentCount)
            .IntegerNumber(l => l.PraiseCount)
            .IntegerNumber(l => l.BrowserCount)
            .IntegerNumber(l => l.CollectCount)
            .Date(l => l.CreateTime)
        )
    )
);
Then write to MQ every time an article is added, deleted, or updated. For example:
private async Task SendToMq(Article article, Operation operation)
{
    // Build the event payload: the operation type plus the ES model mapped from the entity
    // (_mapper, _unitOfWork and _eventBus come from the project's own infrastructure, see the EventBus post)
    ArticleEventData articleEventData = new ArticleEventData();
    articleEventData.Operation = operation;
    articleEventData.Article_ES = _mapper.Map<Article, Article_ES>(article);

    // Record the task first so a failed publish can be retried later
    TaskRecord taskRecord = new TaskRecord();
    taskRecord.Id = CreateEntityId();
    taskRecord.Name = "Send Article";
    taskRecord.Status = (int)TaskRecordStatus.Sending;
    taskRecord.CreateTime = DateTime.Now;
    taskRecord.Data = JsonSerializer.Serialize(articleEventData);
    await _unitOfWork.GetRepository<TaskRecord>().InsertAsync(taskRecord);
    await _unitOfWork.SaveChangesAsync();

    try
    {
        // Publish the event to RabbitMQ through the event bus
        _eventBus.Publish(GetMqExchangeName(), GetRoutingKey(), taskRecord.Id, articleEventData);
    }
    catch (Exception ex)
    {
        // Mark the task record as failed so it can be compensated later
        var taskRecordRepository = _unitOfWork.GetRepository<TaskRecord>();
        TaskRecord update = await taskRecordRepository.GetAsync(taskRecord.Id);
        update.Status = (int)TaskRecordStatus.Failed;
        update.Remark = "Send Failure";
        update.Error = ex.Message;
        update.UpdateTime = DateTime.Now;
        await taskRecordRepository.UpdateAsync(update);
        await _unitOfWork.SaveChangesAsync();
    }
}
After the MQ subscriber receives the message, it writes to es; I won't spell out the specific add, delete, and update handling here.
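For reference, the consumer side is roughly this shape; a minimal sketch (the Operation enum values and variable names are placeholders, not the exact code from my handler):

// Inside the MQ subscriber, after deserializing ArticleEventData
var client = _elasticSearchClient.GetClient();
string index = ElasticIndexHelper.GetIndexName(typeof(Article_ES));

switch (eventData.Operation)
{
    case Operation.Create:
    case Operation.Update:
        // IndexAsync creates or overwrites the document with the given id
        await client.IndexAsync(eventData.Article_ES, i => i.Index(index).Id(eventData.Article_ES.Id));
        break;
    case Operation.Delete:
        await client.DeleteAsync<Article_ES>(eventData.Article_ES.Id, d => d.Index(index));
        break;
}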
3. Querying es
With articles written, it's time to query them. The SDK's query API is fairly involved here: everything is chained together with lambdas. I haven't found a better way, so let's go with it for now.
First, create a collection to hold the query expressions:
List<Action<QueryDescriptor<Article_ES>>> querys = new List<Action<QueryDescriptor<Article_ES>>>();
Then define a few fields that need to be queried
I'm using MultiMatch here so that multiple fields match the same query text, and I specify the ik_smart analyzer:
Field[] fields = { new Field("title"), new Field("tag"), new Field("articleContent"), new Field("description") };
querys.Add(s => s.MultiMatch(y => y
    .Fields(fields)
    .Analyzer("ik_smart")
    .Query(keyword)
    .Type(TextQueryType.BestFields)));
Define highlighting for the query results: the matched terms in the queried fields are wrapped in tags, and the front end handles the styling.
Dictionary<Field, HighlightField> highlightFields = new Dictionary<Field, HighlightField>();
highlightFields.Add(new Field("title"), new HighlightField()
{
    PreTags = new List<string> { "<em>" },
    PostTags = new List<string> { "</em>" },
});
highlightFields.Add(new Field("description"), new HighlightField()
{
    PreTags = new List<string> { "<em>" },
    PostTags = new List<string> { "</em>" },
});
Highlight highlight = new Highlight() { Fields = highlightFields };
To make the query more efficient, I only return a subset of the fields:
SourceFilter sourceFilter = new SourceFilter();
sourceFilter.Includes = new Field[] { "title", "id", "author", "description", "createTime", "browserCount", "commentCount" };
SourceConfig sourceConfig = new SourceConfig(sourceFilter);

Action<SearchRequestDescriptor<Article_ES>> configureRequest = s => s.Index(index)
    .From((homeArticleCondition.PageIndex - 1) * homeArticleCondition.PageSize)
    .Size(homeArticleCondition.PageSize)
    .Query(x => x.Bool(y => y.Must(querys.ToArray())))
    .Source(sourceConfig)
    .Sort(y => y.Field(ht => ht.CreateTime, new FieldSort() { Order = SortOrder.Desc }));
Get the analyzer's tokenization of the query keyword (wrapped in a small helper here):
private async Task<string[]> AnalyzeKeyword(string keyword, string analyzer)
{
    // Ask es to tokenize the keyword with the given analyzer (e.g. ik_smart)
    var analyzeIndexRequest = new AnalyzeIndexRequest
    {
        Text = new string[] { keyword },
        Analyzer = analyzer
    };
    var analyzeResponse = await _elasticSearchClient.GetClient().Indices.AnalyzeAsync(analyzeIndexRequest);
    if (analyzeResponse.Tokens == null) return new string[0];
    return analyzeResponse.Tokens.Select(s => s.Token).ToArray();
}
That covers the main pieces of the query; the complete method is as follows:
public async Task<SearchResult<Article_ES>> SelectArticle(HomeArticleCondition homeArticleCondition)
{
    // SearchResult<T> and HomeArticleCondition are the site's own result/condition types
    string keyword = homeArticleCondition.Keyword.Trim();
    bool isNumber = long.TryParse(keyword, out _);
    List<Action<QueryDescriptor<Article_ES>>> querys = new List<Action<QueryDescriptor<Article_ES>>>();
    if (isNumber)
    {
        // A purely numeric keyword is matched against a few term-level fields instead
        querys.Add(s => s.Bool(x => x.Should(
            should => should.Term(f => f.Field(z => z.Id).Value(keyword)),
            should => should.Term(f => f.Field(z => z.BrowserCount).Value(keyword)),
            should => should.Term(f => f.Field(z => z.CommentCount).Value(keyword)))));
    }
    else
    {
        Field[] fields = { new Field("title"), new Field("tag"), new Field("articleContent"), new Field("description") };
        querys.Add(s => s.MultiMatch(y => y.Fields(fields).Analyzer("ik_smart").Query(keyword).Type(TextQueryType.BestFields)));
    }
    if (homeArticleCondition.ArticleCategoryId > 0)
    {
        querys.Add(s => s.Term(t => t.Field(f => f.ArticleCategoryId).Value(homeArticleCondition.ArticleCategoryId)));
    }

    string index = ElasticIndexHelper.GetIndexName(typeof(Article_ES));

    Dictionary<Field, HighlightField> highlightFields = new Dictionary<Field, HighlightField>();
    highlightFields.Add(new Field("title"), new HighlightField() { PreTags = new List<string> { "<em>" }, PostTags = new List<string> { "</em>" } });
    highlightFields.Add(new Field("description"), new HighlightField() { PreTags = new List<string> { "<em>" }, PostTags = new List<string> { "</em>" } });
    Highlight highlight = new Highlight() { Fields = highlightFields };

    SourceFilter sourceFilter = new SourceFilter();
    sourceFilter.Includes = new Field[] { "title", "id", "author", "description", "createTime", "browserCount", "commentCount" };
    SourceConfig sourceConfig = new SourceConfig(sourceFilter);

    Action<SearchRequestDescriptor<Article_ES>> configureRequest = s => s.Index(index)
        .From((homeArticleCondition.PageIndex - 1) * homeArticleCondition.PageSize)
        .Size(homeArticleCondition.PageSize)
        .Query(x => x.Bool(y => y.Must(querys.ToArray())))
        .Source(sourceConfig)
        .Sort(y => y.Field(ht => ht.CreateTime, new FieldSort() { Order = SortOrder.Desc }))
        .Highlight(highlight);

    var resp = await _elasticSearchClient.GetClient().SearchAsync<Article_ES>(configureRequest);

    // Swap the stored title/description for the highlighted fragments
    foreach (var item in resp.Hits)
    {
        if (item.Highlight == null) continue;
        foreach (var dict in item.Highlight)
        {
            switch (dict.Key)
            {
                case "title":
                    item.Source.Title = string.Join("...", dict.Value);
                    break;
                case "description":
                    item.Source.Description = string.Join("...", dict.Value);
                    break;
            }
        }
    }

    string[] analyzeWords = await AnalyzeKeyword(keyword, "ik_smart");
    List<Article_ES> articles = resp.Hits.Select(s => s.Source).ToList();
    return new SearchResult<Article_ES>(articles, analyzeWords);
}
4. Demonstration effect
After finishing, I published and deployed it to see the effect. The goal is something like Baidu's search, though for now it's obviously a long way from that.
Finally, I'd also like to ask for advice: is there a way to use SearchRequest to encapsulate multiple query conditions, something like this?
SearchRequest searchRequest = new SearchRequest();
searchRequest.From = 0;
searchRequest.Size = 10;
searchRequest.Query = /* multiple query conditions */;
I think the code is a bit more readable this way than with lambdas, and it would make dynamic composition of the conditions easier.
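For reference, the 8.x client does expose an object-initializer style: SearchRequest accepts plain query objects, and several conditions can be combined inside a BoolQuery. A minimal sketch against the same article index (the keyword and category value are just example inputs, and the usual Elastic.Clients.Elasticsearch / Elastic.Clients.Elasticsearch.QueryDsl usings are assumed):

string keyword = "elasticsearch";

var searchRequest = new SearchRequest("t_article")
{
    From = 0,
    Size = 10,
    Query = new BoolQuery
    {
        Must = new List<Query>
        {
            // full-text condition over several fields
            new MultiMatchQuery
            {
                Fields = new Field[] { "title", "tag", "articleContent", "description" },
                Query = keyword,
                Analyzer = "ik_smart"
            },
            // structured condition
            new TermQuery("articleCategoryId") { Value = 1 }
        }
    }
};

var response = await _elasticSearchClient.GetClient().SearchAsync<Article_ES>(searchRequest);

Conditions could then be added to the Must (or Should/Filter) list dynamically before sending the request, which is the kind of composition the lambda style makes awkward.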