I don't know whether you have used MongoDB in your work projects, or in which scenarios. MongoDB is a NoSQL database and, unlike SQL databases, cannot be operated through the MyBatis framework.
If you need to use MongoDB in Spring Boot, I currently know of three ways: the first is to use the official MongoDB SDK directly, the second is the Spring Data JPA-style repository approach, and the third is to use MongoTemplate. The second is internally built on MongoTemplate and merely wraps some common CRUD operations, and MongoTemplate itself is a wrapper around the official SDK, so essentially there is no difference.
In my own projects I have used MongoDB in a cloud storage system and an IM system, with both MongoTemplate and the JPA-style repositories. The repository approach is not particularly pleasant to use, and I stepped on quite a few pitfalls along the way, so let's look at some advanced uses of MongoDB in Spring Boot.
MongoDB annotations
Spring Data MongoDB provides a number of annotations to simplify mapping. These annotations include @Id, @Document, @Field and so on, and can be found in the Spring Data packages. They tell Spring Boot how to map Java objects to MongoDB Documents.
- @Id: specifies which field is used as the primary key; it can be used together with @Field (which also accepts a targetType to control the stored BSON type), for example:
  @Id @Field(value = "_id") private String userId; // use the userId field as the primary key, stored in MongoDB under the field name _id
- @Field: specifies the name of the field in the Document. By default Spring uses the Java field name as the Document field name; if you want the two to differ, use this annotation to specify the Document field name.
- @Document: maps a Java class to a collection in MongoDB. By default Spring uses the class name as the Collection name, but you can use this annotation to customize the Collection name.
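As a quick illustration of how these annotations are typically combined, here is a sketch of a User entity matching the examples used later in this article (the collection name and getters/setters are assumed):

```java
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.mapping.Field;

// Explicitly mapped to the "user" collection (otherwise Spring derives the name from the class name)
@Document("user")
public class User {

    @Id
    @Field("_id")            // userId is the primary key, stored under the field name _id
    private String userId;

    // without @Field, the Document field name is the Java field name;
    // e.g. @Field("user_name") would store it as user_name instead
    private String username;

    private String password;

    // getters and setters omitted
}
```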
Listeners
When MongoTemplate performs CRUD operations, it triggers several different kinds of listener events. We can create listeners of different types to modify query conditions, delete conditions, Document mappings and so on, as well as for logging and performance optimization.
The seven event callbacks of AbstractMongoEventListener (onBeforeConvert, onBeforeSave, onAfterSave, onAfterLoad, onAfterConvert, onBeforeDelete and onAfterDelete) are all dispatched through its #onApplicationEvent method. Creating a listener is also very simple: create a class that extends AbstractMongoEventListener, override the callbacks for the CRUD operations you care about, and register the class as a bean in the Spring container; there can be multiple listeners. Here are some basic uses of listeners:
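A minimal skeleton of such a listener, assuming a plain @Component registration (the class name is made up); the individual callbacks are filled in over the following sections:

```java
import org.springframework.data.mongodb.core.mapping.event.AbstractMongoEventListener;
import org.springframework.data.mongodb.core.mapping.event.BeforeConvertEvent;
import org.springframework.data.mongodb.core.mapping.event.BeforeSaveEvent;
import org.springframework.stereotype.Component;

// Registering the class as a Spring bean is all that is needed for the events to reach it
@Component
public class MongoEntityListener extends AbstractMongoEventListener<Object> {

    @Override
    public void onBeforeConvert(BeforeConvertEvent<Object> event) {
        // runs before the Java object is converted to a Document (used below to set the primary key)
    }

    @Override
    public void onBeforeSave(BeforeSaveEvent<Object> event) {
        // runs right before the Document is written (used below for logging)
    }
}
```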
Setting the Primary Key Value
If no value is specified for the _id field, MongoDB automatically generates an ObjectId as the _id value, which by default ends up as a String on the Java side. If we want to use an int or long as the type of the _id field, it must be set manually before the insert is finally performed.
If you don't want to set the primary key field of the object manually on every insert, you can do it centrally in the #onBeforeConvert callback and assign the primary key field of the Java object there, for example generating a unique value with a UUID or the snowflake algorithm.
@Override
public void onBeforeConvert(BeforeConvertEvent<Object> event) {
    Object source = event.getSource();
    if (!(source instanceof MongoBaseDomain)) {
        return;
    }
    MongoBaseDomain<?> mongoBaseDomain = (MongoBaseDomain<?>) source;
    if (mongoBaseDomain.getId() != null) {
        return;
    }
    // Generate a unique primary key value according to the type of the id field
    // (Long, String, Integer, ...), e.g. with a UUID or the snowflake algorithm, and assign it
    mongoBaseDomain.setId(idValue);
}
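MongoBaseDomain here is the project's own base class rather than anything provided by Spring Data; a minimal sketch of what it might look like:

```java
import org.springframework.data.annotation.Id;

// Hypothetical base class: entities that want their primary key assigned in the listener extend it
public abstract class MongoBaseDomain<ID> {

    @Id
    private ID id;

    public ID getId() {
        return id;
    }

    public void setId(ID id) {
        this.id = id;
    }
}
```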
Logging
onBeforeSave and onBeforeDelete are triggered before save and remove are executed, so we can log the final saved object and the delete criteria in these two callbacks respectively; for update operations I haven't found a corresponding callback.
With MyBatis you can log the executed SQL, and with MongoTemplate we can achieve something similar through listeners. Note, however, that MongoTemplate only provides these seven callbacks, so for operations such as aggregate or bulk writes the final statements cannot be captured through listeners.
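For example, a listener that logs what is about to be saved and the criteria of a remove might look roughly like this (the logger and class name are illustrative):

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.mongodb.core.mapping.event.AbstractMongoEventListener;
import org.springframework.data.mongodb.core.mapping.event.BeforeDeleteEvent;
import org.springframework.data.mongodb.core.mapping.event.BeforeSaveEvent;
import org.springframework.stereotype.Component;

@Component
public class MongoLogListener extends AbstractMongoEventListener<Object> {

    private static final Logger log = LoggerFactory.getLogger(MongoLogListener.class);

    @Override
    public void onBeforeSave(BeforeSaveEvent<Object> event) {
        // the Document that is about to be written to the collection
        log.info("saving into {}: {}", event.getCollectionName(), event.getDocument());
    }

    @Override
    public void onBeforeDelete(BeforeDeleteEvent<Object> event) {
        // the delete criteria, available as a Document
        log.info("removing from {} where {}", event.getCollectionName(), event.getDocument());
    }
}
```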
Removing _class
By default, when a Java object is saved to MongoDB, MongoTemplate adds an extra _class field that holds the fully qualified name of the Java class.
When performing a query, MongoTemplate also adds {_class: {$in: [fully qualified name, fully qualified names of subclasses]}} to the query criteria. Note that this extra condition is combined with the original conditions with an and. Normally this causes no problems, but if we manually specify the CollectionName on insert and use a Map as the inserted object, MongoTemplate does not add the _class field to the Document (MongoTemplate does nothing extra for a Map, and Document itself is a subclass of Map).
In that case, condition-based operations (update, delete, query by criteria) may fail to find anything, and the root cause is that the Documents inserted via a Map have no _class field.
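To make the difference concrete (reusing the User entity and the "user" collection from the examples in this article, with org.bson.Document standing in for a Map):

```java
// Saved as a mapped entity: the stored document gets an extra _class field
// holding the fully qualified class name of User
User entity = new User();
entity.setUsername("xcye");
entity.setPassword("123456");
mongoTemplate.insert(entity, "user");

// Saved as a plain Document (a Map): no _class field is written, so per the behavior
// described above, condition-based operations that carry the _class restriction miss it
mongoTemplate.insert(new Document("username", "xcye").append("password", "123456"), "user");
```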
There are two solutions: 1. remove _class; 2. for Map-based inserts, manually set the _class field on the Map before inserting. Both have their advantages.
I prefer removing _class. If the fully qualified class name is long and the Collection holds a lot of data, writing _class on every save inevitably wastes storage space, while the only role of _class is to tell Spring which Java class a Document stored in MongoDB should be deserialized into.
Normally we don't store several different Java types in the same Collection, so storing _class in every Document is completely unnecessary.
Does deserialization still work after removing _class from Document?
Determining which Java class a Document should be deserialized into happens in the #readType(S source, TypeInformation<T> basicType) method of the type mapper. The default behavior is to read the value of the _class field from the queried Document, compare it with the entityClass passed to find(Query query, Class<T> entityClass), and then decide whether to use the Document's _class or the entityClass.
As I said above, we don't usually keep multiple different Java types in the same Collection, so we can simply use entityClass as the deserialization type.
/**
 * Returns a more specific type based on source and basicType (source comes from the database).
 * The default behavior is to read _class from source and resolve the class from a cache by its
 * fully qualified name. Since the type is passed in directly from mongoTemplate, the class from
 * {@link TypeInformation#getType()} is already the most specific type.
 *
 * @param source must not be {@literal null}.
 * @param basicType must not be {@literal null}.
 * @return type information
 */
@Override
public <T> TypeInformation<? extends T> readType(Bson source, TypeInformation<T> basicType) {
    Class<T> entityClass = basicType.getType();
    // If entityClass is null, fall back to the parent implementation
    // (or resolve the collection's Java class from a local cache by collectionName)
    if (entityClass == null) {
        return super.readType(source, basicType);
    }
    ClassTypeInformation<T> targetType = ClassTypeInformation.from(entityClass);
    return targetType;
}
Write operations
By default, the #writeType(TypeInformation<?>, Bson) method adds the _class field to the Document. To remove _class, we just need to make this method do nothing:
/**
 * The default behavior on write operations is to add {_class: "fully qualified class name"} to the document.
*
* @param info must not be {@literal null}.
* @param sink must not be {@literal null}.
*/
@Override
public void writeType(TypeInformation<?> info, Bson sink) {}
Query
By default, the writeTypeRestrictions(Document result, @Nullable Set<Class<?>> restrictedTypes) method adds {_class: {$in: [...]}} to the query conditions, which is exactly what makes queries go wrong when the _class field is absent. The fix, again, is to override writeTypeRestrictions so that it does nothing:
/**
 * The default behavior on queries is to add {_class: {$in: [...]}} to the statement.
*
* @param result must not be {@literal null}
* @param restrictedTypes must not be {@literal null}
*/
@Override
public void writeTypeRestrictions(Document result, Set<Class<?>> restrictedTypes) {}
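For these overrides to take effect, they need to live in a custom type mapper that is plugged into the converter. A minimal sketch, assuming a subclass of DefaultMongoTypeMapper (the class name is made up); it can be wired in wherever the MappingMongoConverter bean is built, for example the configuration shown later in the custom _id converter section:

```java
import org.springframework.data.mongodb.core.convert.DefaultMongoTypeMapper;

// Hypothetical subclass holding the readType/writeType/writeTypeRestrictions overrides above
public class CustomMongoTypeMapper extends DefaultMongoTypeMapper {
    // ... the three overridden methods from the previous snippets ...
}
```

Wiring it in is a single call on the converter, e.g. mappingConverter.setTypeMapper(new CustomMongoTypeMapper()). If you only want to stop writing _class and don't need the custom readType logic, Spring Data also accepts a null type key: new DefaultMongoTypeMapper(null).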
Primary key
In MongoDB the primary key field name is fixed to _id. By default, if no value is given for the primary key field on insert, MongoDB automatically generates an ObjectId as the _id value.
When performing an insert with MongoTemplate, you also get the same convenience as with MyBatis: if the primary key value is missing from the object, then after a successful save MongoTemplate assigns the automatically generated MongoDB _id back to the field annotated with @Id on the Java object.
User user = new User();
user.setUsername("xcye");
user.setPassword("xcye");
User insert = mongoTemplate.insert(user);
// user.getUserId() = xxxx  (the generated _id has been written back)
If you want MongoTemplate to set the id field automatically, the id field must be of type ObjectId, String or BigInteger; otherwise an exception is thrown during the insert. For the details of this check, see the #assertUpdateableIdIfNotSet method.
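As an illustration (the entity and generator names are made up), an entity whose id is a Long has to be assigned explicitly, for example in the onBeforeConvert listener shown earlier:

```java
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;

@Document("order")
public class Order {

    @Id
    private Long orderId;   // Long cannot be auto-generated, so it must be set before insert
    private String title;

    // getters and setters omitted
}

// mongoTemplate.insert(new Order());   // fails: the Long id is null and cannot be auto-generated
// order.setOrderId(nextSnowflakeId()); // assign it yourself (or via the onBeforeConvert listener)
// mongoTemplate.insert(order);         // OK
```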
Custom _id converter
This one is a pitfall. Suppose the User collection uses userId as the _id field, and its value is a plain string. When we query, update or delete by userId, we may find that no matching record is returned even though the userId we passed in does exist, and this only happens for some userIds.
The reason is that when MongoTemplate executes, it inspects the incoming _id value and checks whether it can be converted to an ObjectId. If it can, MongoTemplate uses an ObjectId object as the _id value; but because the _id field stored in MongoDB is a plain string rather than an ObjectId, nothing is found.
| _id (userId in the Java object): String | username: String | password: String |
| --- | --- | --- |
| 66aeeb73142fcf1d5591c29c | xcye | 123456 |
The query condition we pass in:
{_id: "66aeeb73142fcf1d5591c29c"}
When MongoTemplate executes, it determines that 66aeeb73142fcf1d5591c29c can be converted to an ObjectId, so the final query condition becomes:
{_id: ObjectId("66aeeb73142fcf1d5591c29c")}
This conversion is done in the MongoConverter#convertId method:
default Object convertId(@Nullable Object id, Class<?> targetType) {
    if (id == null || ClassUtils.isAssignableValue(targetType, id)) {
        return id;
    }
    // Spring determines that targetType is ObjectId and that 66aeeb73142fcf1d5591c29c
    // can be converted to an ObjectId
    if (ClassUtils.isAssignable(ObjectId.class, targetType)) {
        if (id instanceof String) {
            // the plain string is turned into an ObjectId here
            if (ObjectId.isValid(id.toString())) {
                return new ObjectId(id.toString());
            }
            // avoid ConversionException as convertToMongoType will return String anyways.
            return id;
        }
    }
    try {
        return getConversionService().canConvert(id.getClass(), targetType)
                ? getConversionService().convert(id, targetType)
                : convertToMongoType(id, (TypeInformation<?>) null);
    } catch (ConversionException o_O) {
        return convertToMongoType(id, (TypeInformation<?>) null);
    }
}
So, to avoid a plain string being converted to an ObjectId, we need to override the convertId method: create a class that extends MappingMongoConverter and override convertId in it.
@AutoConfiguration
@ConditionalOnSingleCandidate(MongoDatabaseFactory.class)
public class MongoAutoConfiguration {

    @Bean
    @ConditionalOnMissingBean
    MappingMongoConverter mappingMongoConverter(MongoDatabaseFactory factory, MongoMappingContext context,
            MongoCustomConversions conversions) {
        DbRefResolver dbRefResolver = new DefaultDbRefResolver(factory);
        MappingMongoConverter mappingConverter = new MappingMongoConverter(dbRefResolver, context) {
            @Override
            public Object convertId(Object id, Class<?> targetType) {
                if (id == null) {
                    return null;
                }
                // keep plain strings as strings instead of converting them to ObjectId
                if (id instanceof String) {
                    return id;
                }
                // other conversions: fall back to the default behavior
                return super.convertId(id, targetType);
            }
        };
        mappingConverter.setCustomConversions(conversions);
        return mappingConverter;
    }
}
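With this converter in place, a query by the string _id from the example above should match again, e.g.:

```java
// The 24-character hex string is no longer rewritten to ObjectId("...") by convertId,
// so it matches the plain-string _id stored in the collection
User user = mongoTemplate.findById("66aeeb73142fcf1d5591c29c", User.class);
```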
Automatic database switching
When using MongoTemplate operations, we can dynamically switch MongoDB databases, this feature is very useful in the scenario of sub-database, dynamically switching MongoDB databases is done through theMongoDatabaseFactorySupport
to get it done.
MongoTemplate is called on each execution of the#doGetDatabase
To get the database for the operation, we just need to create our ownMongoDatabaseFactory
If you want to return the database you are working on in the getMongoDatabase method, just refer to theSimpleMongoClientDatabaseFactory
。
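A minimal sketch of such a factory, assuming a ThreadLocal carries the name of the target database (the class and method names are made up); it extends SimpleMongoClientDatabaseFactory and only overrides the lookup of the current database:

```java
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoDatabase;
import org.springframework.dao.DataAccessException;
import org.springframework.data.mongodb.core.SimpleMongoClientDatabaseFactory;

public class CustomMongoDatabaseFactory extends SimpleMongoClientDatabaseFactory {

    // Hypothetical holder for the database chosen by the current thread
    private static final ThreadLocal<String> CURRENT_DB = new ThreadLocal<>();

    public CustomMongoDatabaseFactory(MongoClient mongoClient, String databaseName) {
        super(mongoClient, databaseName);
    }

    public static void use(String databaseName) {
        CURRENT_DB.set(databaseName);
    }

    public static void clear() {
        CURRENT_DB.remove();
    }

    @Override
    public MongoDatabase getMongoDatabase() throws DataAccessException {
        String db = CURRENT_DB.get();
        // fall back to the default database from the configuration when nothing was selected
        return db != null ? getMongoDatabase(db) : super.getMongoDatabase();
    }
}
```

It can then be registered in place of the default factory: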
@AutoConfiguration
@ConditionalOnSingleCandidate(MongoClient.class)
public class MongoAutoConfiguration {

    @Bean
    MongoDatabaseFactorySupport<?> mongoDatabaseFactory(MongoClient mongoClient, MongoProperties properties) {
        return new CustomMongoDatabaseFactory(mongoClient, properties.getMongoClientDatabase());
    }
}
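Usage could then look like this (relying on the ThreadLocal-based factory sketched above):

```java
// Route this thread's next operations to another database, then reset
CustomMongoDatabaseFactory.use("tenant_002");
mongoTemplate.insert(user);          // written into tenant_002
CustomMongoDatabaseFactory.clear();  // subsequent operations use the default database again
```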
Because MongoDB is a NoSQL database, it does not require the database or the collections to exist beforehand the way a SQL database requires its tables: when MongoDB executes an operation, the database or Collection is created automatically if it does not exist.