sourceName stringclasses 1 value | url stringclasses 20 values | action stringclasses 1 value | body stringlengths 23 1.11k | format stringclasses 1 value | metadata dict | title stringclasses 20 values | updated stringclasses 1 value | embedding listlengths 384 384 |
|---|---|---|---|---|---|---|---|---|
devcenter | https://www.mongodb.com/developer/products/atlas/atlas-search-cene-1 | created |
# The Atlas Search 'cene: Season 1
Welcome to the first season of a video series dedicated to Atlas Search! This series of videos is designed to guide you through the journey from getting started and understanding the concepts, to advanced techniques.
## What is Atlas Search?
[Atlas Search][1] is an embedded full-text search in MongoDB Atlas that gives you a seamless, scalable experience for building relevance-based app features. Built on Apache Lucene, Atlas Search eliminates the need to run a separate search system alongside your database.
By integrating the database, search engine, and sync mechanism into a single, unified, and fully managed platform, Atlas Search is the fastest and easiest way to build relevance-based search capabilities directly into applications. | md | {
"tags": [
"Atlas"
],
"pageDescription": "The Atlas Search 'cene: Season 1",
"contentType": "Video"
} | The Atlas Search 'cene: Season 1 | 2024-05-20T17:32:23.500Z |
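As a hedged sketch of what building on Atlas Search looks like, here is a minimal `$search` aggregation pipeline. The index name, field names, and query terms are all assumptions for illustration; the pipeline is only constructed here, not run against a cluster.

```javascript
// Minimal sketch of an Atlas Search $search stage (assumed index "default",
// assumed fields "title" and "plot"); the pipeline is built but not executed.
const searchPipeline = [
  {
    $search: {
      index: "default",           // Atlas Search index name (assumption)
      text: {
        query: "summer vacation", // the user's search terms
        path: ["title", "plot"]   // fields to search across
      }
    }
  },
  { $limit: 5 },
  { $project: { title: 1, score: { $meta: "searchScore" } } }
];

console.log(JSON.stringify(searchPipeline[0].$search.text.path)); // → ["title","plot"]
```

In an application, you would pass this array to `collection.aggregate(...)` with the MongoDB driver.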
> Hip to the *'cene*
>
> The name of this video series comes from a contraction of "Lucene",
> the search engine library leveraged by Atlas. Or it's a short form of "scene".
## Episode Guide
### **[Episode 1: What is Atlas Search & Quick Start][2]**
In this first episode of the Atlas Search 'cene, learn what Atlas Search is and get a quick-start introduction to setting it up on your data. Within a few clicks, you can build a powerful, full-text search index on your Atlas collection data and deliver fast, relevant results for your users' queries.
### **[Episode 2: Configuration / Development Environment][3]**
To get the most out of Atlas Search, configure it for your querying needs. In this episode, learn how Atlas Search maps your documents to its index, and discover the configuration controls you have.
### **[Episode 3: Indexing][4]**
While Atlas Search automatically indexes your collection's content, it does demand attention to the indexing configuration details in order to match users' queries appropriately. This episode covers how Atlas Search builds an inverted index, and the options one must consider.
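To make the inverted-index idea concrete, here is a toy sketch in JavaScript. This is not Atlas Search's actual implementation — Lucene adds analyzers, term positions, and scoring — but the core mapping from token to document ids is the same.

```javascript
// Toy inverted index: map each lowercase token to the ids of documents containing it.
const docs = [
  { _id: 1, text: "the quick brown fox" },
  { _id: 2, text: "the lazy brown dog" }
];

const invertedIndex = {};
for (const doc of docs) {
  // Deduplicate tokens per document so each id appears at most once per term.
  for (const token of new Set(doc.text.toLowerCase().split(/\s+/))) {
    (invertedIndex[token] ??= []).push(doc._id);
  }
}

console.log(invertedIndex["brown"]); // → [ 1, 2 ]
```

At query time, looking up a term costs a single map access instead of a scan over every document.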
### **[Episode 4: Searching][5]**
Atlas Search provides a rich set of query operators and relevancy controls. This episode covers the common query operators, their relevancy controls, and ends with coverage of the must-have Query Analytics feature.
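As a sketch of what such operators look like, the `compound` operator combines clauses with different relevancy effects. The field names and boost value below are assumptions for illustration; the stage is only constructed, not executed.

```javascript
// Sketch of a compound Atlas Search stage: "must" clauses are required,
// "should" clauses boost the relevancy score when they match.
const compoundStage = {
  $search: {
    compound: {
      must: [{ text: { query: "beach", path: "description" } }],
      should: [
        {
          text: {
            query: "ocean view",
            path: "description",
            score: { boost: { value: 2 } } // double this clause's contribution
          }
        }
      ]
    }
  }
};

console.log(Object.keys(compoundStage.$search.compound)); // → [ 'must', 'should' ]
```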
### **[Episode 5: Faceting][6]**
Facets produce additional context for search results, providing a list of subsets and counts within. This episode details the faceting options available in Atlas Search.
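As a hedged sketch, a string facet is expressed with the `$searchMeta` stage: the operator selects matching documents, and each named facet returns buckets with counts. The `genres` field and bucket count here are assumptions for illustration.

```javascript
// Sketch of a $searchMeta facet stage: the operator filters, the facets
// definition groups matching documents into counted buckets.
const facetStage = {
  $searchMeta: {
    facet: {
      operator: { text: { query: "adventure", path: "title" } },
      facets: {
        genresFacet: { type: "string", path: "genres", numBuckets: 10 }
      }
    }
  }
};

console.log(facetStage.$searchMeta.facet.facets.genresFacet.path); // → genres
```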
### **[Episode 6: Advanced Search Topics][7]**
In this episode, we go through some more advanced search topics including embedded documents, fuzzy search, autocomplete, highlighting, and geospatial.
### **[Episode 7: Query Analytics][8]**
Are your users finding what they are looking for? Are your top queries returning the best results? This episode covers the important topic of query analytics. If you're using search, you need this!
### **[Episode 8: Tips & Tricks][9]**
In this final episode of The Atlas Search 'cene Season 1, we cover useful techniques to introspect query details and see how the relevancy score is computed. Also shown is how to get facets and search results back in one API call.
[1]: https://www.mongodb.com/atlas/search
[2]: https://www.mongodb.com/developer/videos/what-is-atlas-search-quick-start/
[3]: https://www.mongodb.com/developer/videos/atlas-search-configuration-development-environment/
[4]: https://www.mongodb.com/developer/videos/mastering-indexing-for-perfect-query-matches/
[5]: https://www.mongodb.com/developer/videos/query-operators-relevancy-controls-for-precision-searches/
[6]: https://www.mongodb.com/developer/videos/faceting-mastery-unlock-the-full-potential-of-atlas-search-s-contextual-insights/
[7]: https://www.mongodb.com/developer/videos/atlas-search-mastery-elevate-your-search-with-fuzzy-geospatial-highlighting-hacks/
[8]: https://www.mongodb.com/developer/videos/atlas-search-query-analytics/
[9]: https://www.mongodb.com/developer/videos/tips-and-tricks-the-atlas-search-cene-season-1-episode-8/
devcenter | https://www.mongodb.com/developer/products/mongodb/atlas-open-ai-review-summary | created |
# Using MongoDB Atlas Triggers to Summarize Airbnb Reviews with OpenAI
In the realm of property rentals, reviews play a pivotal role. MongoDB Atlas triggers, combined with the power of OpenAI's models, can help summarize and analyze these reviews in real-time. In this article, we'll explore how to utilize MongoDB Atlas triggers to process Airbnb reviews, yielding concise summaries and relevant tags.
This article is an additional feature added to the hotels and apartment sentiment search application developed in *Leveraging OpenAI and MongoDB Atlas for Improved Search Functionality*.
## Introduction
MongoDB Atlas triggers allow users to define functions that execute in real-time in response to database operations. These triggers can be harnessed to enhance data processing and analysis capabilities. In this example, we aim to generate summarized reviews and tags for a sample Airbnb dataset.
Our original data model has each review embedded in the listing document as an array: | md | {
"tags": [
"MongoDB",
"JavaScript",
"AI",
"Node.js"
],
"pageDescription": "Uncover the synergy of MongoDB Atlas triggers and OpenAI models in real-time analysis and summarization of Airbnb reviews. ",
"contentType": "Tutorial"
} | Using MongoDB Atlas Triggers to Summarize Airbnb Reviews with OpenAI | 2024-05-20T17:32:23.500Z |
```javascript
"reviews": [ { "_id": "2663437",
    "date": { "$date": "2012-10-20T04:00:00.000Z" },
    "listing_id": "664017",
    "reviewer_id": "633940",
    "reviewer_name": "Patricia",
    "comments": "I booked the room at Marinete's apartment for my husband. He was staying in Rio for a week because he was studying Portuguese. He loved the place. Marinete was very helpfull, the room was nice and clean. \r\nThe location is perfect. He loved the time there. \r\n\r\n" },
  { "_id": "2741592",
    "date": { "$date": "2012-10-28T04:00:00.000Z" },
    "listing_id": "664017",
"reviewer_id": "3932440",
"reviewer_name": "Carolina",
"comments": "Es una muy buena anfitriona, preocupada de que te encuentres cómoda y te sugiere que actividades puedes realizar. Disfruté mucho la estancia durante esos días, el sector es central y seguro." }, ... ]
```
## Prerequisites
- App Services application (e.g., application-0). Ensure linkage to the cluster with the Airbnb data.
- OpenAI account with API access.
![Open AI Key]
### Secrets and Values
1. Navigate to your App Services application.
2. Under "Values," create a secret named `openAIKey` with your OpenAI API key.
3. Create a linked value named `OpenAIKey` and link it to the secret.
## The trigger code
The provided trigger listens for changes in the `sample_airbnb.listingsAndReviews` collection. Upon detecting a new review, it samples up to 50 reviews, sends them to OpenAI's API for summarization, and updates the original document with the summarized content and tags.
Please notice that the trigger reacts only to updates marked with a `"process" : false` flag. This field indicates that no summary has been created for this batch of reviews yet.
Example of a review update operation that will fire this trigger:
```javascript
listingsAndReviews.updateOne({"_id" : "1129303"}, { $push : { "reviews" : new_review } , $set : { "process" : false }});
```
### Sample reviews function
To prevent overloading the API with a large number of reviews, a function sampleReviews is defined to randomly sample up to 50 reviews:
```javascript
function sampleReviews(reviews) {
if (reviews.length <= 50) {
return reviews;
}
  const sampledReviews = [];
const seenIndices = new Set();
while (sampledReviews.length < 50) {
const randomIndex = Math.floor(Math.random() * reviews.length);
if (!seenIndices.has(randomIndex)) {
seenIndices.add(randomIndex);
sampledReviews.push(reviews[randomIndex]);
}
}
return sampledReviews;
}
```
### Main trigger logic
The main trigger logic is invoked when an update change event is detected with a `"process" : false` field.
```javascript
exports = async function(changeEvent) {
// A Database Trigger will always call a function with a changeEvent.
// Documentation on ChangeEvents: https://www.mongodb.com/docs/manual/reference/change-events
  // This function summarizes a listing's sampled reviews with OpenAI and updates the document.
  function sampleReviews(reviews) {
    // Randomly sample up to 50 reviews (same helper as shown above).
if (reviews.length <= 50) {
return reviews;
}
const sampledReviews = [];
const seenIndices = new Set();
while (sampledReviews.length < 50) {
const randomIndex = Math.floor(Math.random() * reviews.length);
if (!seenIndices.has(randomIndex)) {
seenIndices.add(randomIndex);
sampledReviews.push(reviews[randomIndex]);
}
    }
    return sampledReviews;
}
// Access the _id of the changed document:
const docId = changeEvent.documentKey._id;
  const doc = changeEvent.fullDocument;
// Get the MongoDB service you want to use (see "Linked Data Sources" tab)
const serviceName = "mongodb-atlas";
const databaseName = "sample_airbnb";
const collection = context.services.get(serviceName).db(databaseName).collection(changeEvent.ns.coll);
// This function is the endpoint's request handler.
// URL to make the request to the OpenAI API.
const url = 'https://api.openai.com/v1/chat/completions';
// Fetch the OpenAI key stored in the context values.
  const openai_key = context.values.get("openAIKey");
const reviews = doc.reviews.map((review) => {return {"comments" : review.comments}});
const sampledReviews= sampleReviews(reviews);
// Prepare the request string for the OpenAI API.
  const reqString = `Summarize the reviews provided here: ${JSON.stringify(sampledReviews)} | instructions example:\n\n [{"comment" : "Very Good bed"} ,{"comment" : "Very bad smell"} ] \nOutput: {"overall_review": "Overall good beds and bad smell" , "neg_tags" : ["bad smell"], pos_tags : ["good bed"]}. No explanation. No 'Output:' string in response. Valid JSON. `;
  console.log(`reqString: ${reqString}`);
  // Call OpenAI API to get the response.
let resp = await context.http.post({
url: url,
headers: {
'Authorization': [`Bearer ${openai_key}`],
'Content-Type': ['application/json']
},
body: JSON.stringify({
model: "gpt-4",
temperature: 0,
messages: [
{
"role": "system",
"content": "Output json generator follow only provided example on the current reviews"
},
{
"role": "user",
"content": reqString
}
]
})
});
// Parse the JSON response
let responseData = JSON.parse(resp.body.text());
// Check the response status.
if(resp.statusCode === 200) {
console.log("Successfully received code.");
    console.log(JSON.stringify(responseData));
    const code = responseData.choices[0].message.content;
// Get the required data to be added into the document
    const updateDoc = JSON.parse(code);
    // Set a flag that this document does not need further re-processing
    updateDoc.process = true;
await collection.updateOne({_id : docId}, {$set : updateDoc});
} else {
    console.error("Failed to generate summary JSON.");
console.log(JSON.stringify(responseData));
return {};
}
};
```
Key steps include:
- API request preparation: Reviews from the changed document are sampled and prepared into a request string for the OpenAI API. The format and instructions are tailored to ensure the API returns a valid JSON with summarized content and tags.
- API interaction: Using the context.http.post method, the trigger sends the prepared data to the OpenAI API.
- Updating the original document: Upon a successful response from the API, the trigger updates the original document with the summarized content, negative tags (neg_tags), positive tags (pos_tags), and a process flag set to true.
Here is a sample result that is added to the processed listing document:
```
"process": true,
"overall_review": "Overall, guests had a positive experience at Marinete's apartment. They praised the location, cleanliness, and hospitality. However, some guests mentioned issues with the dog and language barrier.",
"neg_tags": [ "language barrier", "dog issues" ],
"pos_tags": [ "great location", "cleanliness", "hospitality" ]
```
Once the data is added to our documents, providing this information in our Vue application is as simple as adding this HTML template:
```html
<div>
  Overall Review (ai based) : {{ listing.overall_review }}
  <!-- Illustrative markup: the original tags were lost in extraction;
       element and class names here are assumptions. -->
  <span v-for="tag in listing.pos_tags" :key="tag" class="tag positive">{{ tag }}</span>
  <span v-for="tag in listing.neg_tags" :key="tag" class="tag negative">{{ tag }}</span>
</div>
```
## Conclusion
By integrating MongoDB Atlas triggers with OpenAI's powerful models, we can efficiently process and analyze large volumes of reviews in real-time. This setup not only provides concise summaries of reviews but also categorizes them into positive and negative tags, offering valuable insights to property hosts and potential renters.
Questions? Comments? Let’s continue the conversation over in our community forums.
devcenter | https://www.mongodb.com/developer/products/mongodb/getting-started-with-mongodb-and-codewhisperer | created |
# Getting Started with MongoDB and AWS CodeWhisperer
**Introduction**
----------------
Amazon CodeWhisperer is trained on billions of lines of code and can generate code suggestions — ranging from snippets to full functions — in real-time, based on your comments and existing code. AI code assistants have revolutionized developers’ coding experience, but what sets Amazon CodeWhisperer apart is that MongoDB has collaborated with the AWS Data Science team, enhancing its capabilities!
At MongoDB, we are always looking to enhance the developer experience, and we've fine-tuned the CodeWhisperer Foundational Models to deliver top-notch code suggestions — trained on, and tailored for, MongoDB. This gives developers of all levels the best possible experience when using CodeWhisperer for MongoDB functions. | md | {
"tags": [
"MongoDB",
"JavaScript",
"Java",
"Python",
"AWS",
"AI"
],
"pageDescription": "",
"contentType": "Tutorial"
} | Getting Started with MongoDB and AWS Codewhisperer | 2024-05-20T17:32:23.500Z |
This tutorial will help you get CodeWhisperer up and running in VS Code, but CodeWhisperer also works with a number of other IDEs, including IntelliJ IDEA, AWS Cloud9, AWS Lambda console, JupyterLab, and Amazon SageMaker Studio. On the [Amazon CodeWhisperer site][1], you can find tutorials that demonstrate how to set up CodeWhisperer on different IDEs, as well as other documentation.
*Note:* CodeWhisperer allows users to start without an AWS account, since creating an AWS account usually requires a credit card. Currently, CodeWhisperer is free for individual users, so it’s super easy to get up and running.
**Installing CodeWhisperer for VS Code**
CodeWhisperer doesn’t have its own VS Code extension. It is part of a larger extension for AWS services called AWS Toolkit. AWS Toolkit is available in the VS Code extensions store.
1. Open VS Code and navigate to the extensions store (bottom icon on the left panel).
2. Search for CodeWhisperer and it will show up as part of the AWS Toolkit.
![Searching for the AWS ToolKit Extension][2]
3. Once found, hit Install. Next, you’ll see the full AWS Toolkit listing.
   ![The AWS Toolkit full listing][3]
4. Once installed, you’ll need to authorize CodeWhisperer via a Builder
ID to connect to your AWS developer account (or set up a new account
if you don’t already have one).
![Authorise CodeWhisperer][4]
**Using CodeWhisperer**
-----------------------
**Navigating code suggestions**
![CodeWhisperer Running][5]
With CodeWhisperer installed and running, as you enter your prompt or code, CodeWhisperer will offer inline code suggestions. If you want to keep the suggestion, use **TAB** to accept it. CodeWhisperer may provide multiple suggestions to choose from depending on your use case. To navigate between suggestions, use the left and right arrow keys to view them, and **TAB** to accept.
If you don’t like the suggestions you see, keep typing (or hit **ESC**). The suggestions will disappear, and CodeWhisperer will generate new ones at a later point based on the additional context.
**Requesting suggestions manually**
You can request suggestions at any time. Use **Option-C** on Mac or **ALT-C** on Windows. After you receive suggestions, use **TAB** to accept and arrow keys to navigate.
**Getting the best recommendations**
For best results, follow these practices. | md | {
"tags": [
"MongoDB",
"JavaScript",
"Java",
"Python",
"AWS",
"AI"
],
"pageDescription": "",
"contentType": "Tutorial"
} | Getting Started with MongoDB and AWS Codewhisperer | 2024-05-20T17:32:23.500Z | [
devcenter | https://www.mongodb.com/developer/products/mongodb/getting-started-with-mongodb-and-codewhisperer | created |
- Give CodeWhisperer something to work with. The more code your file contains, the more context CodeWhisperer has for generating recommendations.
- Write descriptive comments in natural language — for example
```
// Take a JSON document as a String and store it in MongoDB returning the _id
```
Or
```
//Insert a document in a collection with a given _id and a discountLevel
```
- Specify the libraries you prefer at the start of your file by using import statements.
```
// This Java class works with MongoDB sync driver.
// This class implements Connection to MongoDB and CRUD methods.
```
- Use descriptive names for variables and functions
- Break down complex tasks into simpler tasks
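To make these practices concrete, here is a driver-free sketch of the prompt style described above. The comment is the natural-language prompt; the method below it is only an illustration of the kind of completion CodeWhisperer might produce, so the names and logic are hypothetical, not actual CodeWhisperer output. A real completion for this prompt would typically call `insertOne` on the MongoDB sync driver, but this sketch stays dependency-free so it runs anywhere.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class PromptStyleDemo {

    // Prompt comment, written in natural language as recommended above:
    // Build a document with a given _id and a discountLevel
    static Map<String, Object> buildDiscountDocument(String id, int discountLevel) {
        // Descriptive names make the intent obvious to both humans and the assistant.
        Map<String, Object> document = new LinkedHashMap<>();
        document.put("_id", id);
        document.put("discountLevel", discountLevel);
        return document;
    }

    public static void main(String[] args) {
        Map<String, Object> document = buildDiscountDocument("person-42", 2);
        System.out.println(document.get("discountLevel")); // prints 2
    }
}
```

Notice how the descriptive comment, descriptive names, and the small single-purpose method mirror the practices in the list above.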
**Provide feedback**
---------------- | md | {
"tags": [
"MongoDB",
"JavaScript",
"Java",
"Python",
"AWS",
"AI"
],
"pageDescription": "",
"contentType": "Tutorial"
} | Getting Started with MongoDB and AWS Codewhisperer | 2024-05-20T17:32:23.500Z | [
devcenter | https://www.mongodb.com/developer/products/mongodb/getting-started-with-mongodb-and-codewhisperer | created |
Like all generative AI tools, CodeWhisperer is forever learning and forever expanding its foundational knowledge base, and MongoDB is looking for feedback. If you are using Amazon CodeWhisperer in your MongoDB development, we’d love to hear from you.
We’ve created a special “codewhisperer” tag on our [Developer Forums][6], and if you tag any post with this, it will be visible to our CodeWhisperer project team and we will get right on it to help and provide feedback. If you want to see what others are doing with CodeWhisperer on our forums, the [tag search link][7] will jump you straight into all the action.
We can’t wait to see your thoughts and impressions of MongoDB and Amazon CodeWhisperer together. | md | {
"tags": [
"MongoDB",
"JavaScript",
"Java",
"Python",
"AWS",
"AI"
],
"pageDescription": "",
"contentType": "Tutorial"
} | Getting Started with MongoDB and AWS Codewhisperer | 2024-05-20T17:32:23.500Z | [
devcenter | https://www.mongodb.com/developer/products/mongodb/getting-started-with-mongodb-and-codewhisperer | created | [1]: https://aws.amazon.com/codewhisperer/resources/#Getting_started
[2]: https://images.contentstack.io/v3/assets/blt39790b633ee0d5a7/blt1bfd28a846063ae9/65481ef6e965d6040a3dcc37/CW_1.png
[3]: https://images.contentstack.io/v3/assets/blt39790b633ee0d5a7/bltde40d5ae1b9dd8dd/65481ef615630d040a4b2588/CW_2.png
[4]: https://images.contentstack.io/v3/assets/blt39790b633ee0d5a7/blt636bb8d307bebcee/65481ef6a6e009040a740b86/CW_3.png | md | {
"tags": [
"MongoDB",
"JavaScript",
"Java",
"Python",
"AWS",
"AI"
],
"pageDescription": "",
"contentType": "Tutorial"
} | Getting Started with MongoDB and AWS Codewhisperer | 2024-05-20T17:32:23.500Z | [
devcenter | https://www.mongodb.com/developer/products/mongodb/getting-started-with-mongodb-and-codewhisperer | created | [5]: https://images.contentstack.io/v3/assets/blt39790b633ee0d5a7/bltf1e0ebeea2089e6a/65481ef6077aca040a5349da/CW_4.png
[6]: https://www.mongodb.com/community/forums/
[7]: https://www.mongodb.com/community/forums/tag/codewhisperer | md | {
"tags": [
"MongoDB",
"JavaScript",
"Java",
"Python",
"AWS",
"AI"
],
"pageDescription": "",
"contentType": "Tutorial"
} | Getting Started with MongoDB and AWS Codewhisperer | 2024-05-20T17:32:23.500Z | [
devcenter | https://www.mongodb.com/developer/code-examples/java/rest-apis-java-spring-boot | created | # REST APIs with Java, Spring Boot, and MongoDB
## GitHub repository
If you want to write REST APIs in Java at the speed of light, I have what you need. I wrote this template to get you started. I have tried to solve as many problems as possible in it.
So if you want to start writing REST APIs in Java, clone this project, and you will be up to speed in no time.
```shell
git clone https://github.com/mongodb-developer/java-spring-boot-mongodb-starter
```
That’s all folks! All you need is in this repository. Below I will explain a few of the features and details about this template, but feel free to skip what is not necessary for your understanding.
## README
All the extra information and commands you need to get this project going are in the `README.md` file, which you can read on GitHub.
## Spring and MongoDB configuration | md | {
"tags": [
"Java",
"Spring"
],
"pageDescription": "Take a shortcut to REST APIs with this Java/Spring Boot and MongoDB example application that embeds all you'll need to get going.",
"contentType": "Code Example"
} | REST APIs with Java, Spring Boot, and MongoDB | 2024-05-20T17:32:23.500Z | [
devcenter | https://www.mongodb.com/developer/code-examples/java/rest-apis-java-spring-boot | created |
The configuration can be found in the `MongoDBConfiguration.java` class.
```java
package com.mongodb.starter;

import ...

import static org.bson.codecs.configuration.CodecRegistries.fromProviders;
import static org.bson.codecs.configuration.CodecRegistries.fromRegistries;

@Configuration
public class MongoDBConfiguration {

    @Value("${spring.data.mongodb.uri}")
    private String connectionString;

    @Bean
    public MongoClient mongoClient() {
        CodecRegistry pojoCodecRegistry = fromProviders(PojoCodecProvider.builder().automatic(true).build());
        CodecRegistry codecRegistry = fromRegistries(MongoClientSettings.getDefaultCodecRegistry(), pojoCodecRegistry);
        return MongoClients.create(MongoClientSettings.builder()
                                       .applyConnectionString(new ConnectionString(connectionString))
                                       .codecRegistry(codecRegistry)
                                       .build());
    }
}
``` | md | {
"tags": [
"Java",
"Spring"
],
"pageDescription": "Take a shortcut to REST APIs with this Java/Spring Boot and MongoDB example application that embeds all you'll need to get going.",
"contentType": "Code Example"
} | REST APIs with Java, Spring Boot, and MongoDB | 2024-05-20T17:32:23.500Z | [
devcenter | https://www.mongodb.com/developer/code-examples/java/rest-apis-java-spring-boot | created |
The important section here is the MongoDB configuration, of course. Firstly, you will notice the connection string is automatically retrieved from the `application.properties` file, and secondly, you will notice the configuration of the `MongoClient` bean.
A `Codec` is the interface that abstracts the processes of decoding a BSON value into a Java object and encoding a Java object into a BSON value.
A `CodecRegistry` contains a set of `Codec` instances that are accessed according to the Java classes that they encode from and decode to.
The MongoDB driver is capable of encoding and decoding BSON for us, so we do not have to take care of this anymore. All the configuration we need for this project to run is here and nowhere else.
You can read the driver documentation if you want to know more about this topic.
## Multi-document ACID transactions | md | {
"tags": [
"Java",
"Spring"
],
"pageDescription": "Take a shortcut to REST APIs with this Java/Spring Boot and MongoDB example application that embeds all you'll need to get going.",
"contentType": "Code Example"
} | REST APIs with Java, Spring Boot, and MongoDB | 2024-05-20T17:32:23.500Z | [
devcenter | https://www.mongodb.com/developer/code-examples/java/rest-apis-java-spring-boot | created |
Just for the sake of it, I also used multi-document ACID transactions in a few methods where it could potentially make sense to use ACID transactions. You can check all the code in the `MongoDBPersonRepository` class.
Here is an example:
```java
private static final TransactionOptions txnOptions = TransactionOptions.builder()
        .readPreference(ReadPreference.primary())
        .readConcern(ReadConcern.MAJORITY)
        .writeConcern(WriteConcern.MAJORITY)
        .build();

@Override
public List<PersonEntity> saveAll(List<PersonEntity> personEntities) {
    try (ClientSession clientSession = client.startSession()) {
        return clientSession.withTransaction(() -> {
            personEntities.forEach(p -> p.setId(new ObjectId()));
            personCollection.insertMany(clientSession, personEntities);
            return personEntities;
        }, txnOptions);
    }
}
``` | md | {
"tags": [
"Java",
"Spring"
],
"pageDescription": "Take a shortcut to REST APIs with this Java/Spring Boot and MongoDB example application that embeds all you'll need to get going.",
"contentType": "Code Example"
} | REST APIs with Java, Spring Boot, and MongoDB | 2024-05-20T17:32:23.500Z | [
devcenter | https://www.mongodb.com/developer/code-examples/java/rest-apis-java-spring-boot | created | As you can see, I’m using an auto-closeable try-with-resources which will automatically close the client session at the end. This helps me to keep the code clean and simple.
Some of you may argue that it is actually too simple because transactions (and write operations, in general) can throw exceptions, and I’m not handling any of them here… You are absolutely right and this is an excellent transition to the next part of this article.
## Exception management
Transactions in MongoDB can raise exceptions for various reasons, and I don’t want to go into the details too much here, but since MongoDB 3.6, any write operation that fails can be automatically retried once. And the transactions are no different. See the documentation for retryWrites.
If retryable writes are disabled or if a write operation fails twice, then MongoDB will send a MongoException (extends RuntimeException) which should be handled properly. | md | {
"tags": [
"Java",
"Spring"
],
"pageDescription": "Take a shortcut to REST APIs with this Java/Spring Boot and MongoDB example application that embeds all you'll need to get going.",
"contentType": "Code Example"
} | REST APIs with Java, Spring Boot, and MongoDB | 2024-05-20T17:32:23.500Z | [
devcenter | https://www.mongodb.com/developer/code-examples/java/rest-apis-java-spring-boot | created | Luckily, Spring provides the annotation `ExceptionHandler` to help us do that. See the code in my controller `PersonController`. Of course, you will need to adapt and enhance this in your real project, but you have the main idea here.
```java
@ExceptionHandler(RuntimeException.class)
public final ResponseEntity<Exception> handleAllExceptions(RuntimeException e) {
    logger.error("Internal server error.", e);
    return new ResponseEntity<>(e, HttpStatus.INTERNAL_SERVER_ERROR);
}
```
## Aggregation pipeline
MongoDB's aggregation pipeline is a very powerful and efficient way to run your complex queries as close as possible to your data for maximum efficiency. Using it can ease the computational load on your application.
Just to give you a small example, I implemented the `/api/persons/averageAge` route to show you how I can retrieve the average age of the persons in my collection. | md | {
"tags": [
"Java",
"Spring"
],
"pageDescription": "Take a shortcut to REST APIs with this Java/Spring Boot and MongoDB example application that embeds all you'll need to get going.",
"contentType": "Code Example"
} | REST APIs with Java, Spring Boot, and MongoDB | 2024-05-20T17:32:23.500Z | [
devcenter | https://www.mongodb.com/developer/code-examples/java/rest-apis-java-spring-boot | created | ```java
@Override
public double getAverageAge() {
    List<Bson> pipeline = List.of(group(new BsonNull(), avg("averageAge", "$age")), project(excludeId()));
    return personCollection.aggregate(pipeline, AverageAgeDTO.class).first().averageAge();
}
```
Also, you can note here that I’m using the `personCollection` which was initially instantiated like this:
```java
private MongoCollection<PersonEntity> personCollection;

@PostConstruct
void init() {
    personCollection = client.getDatabase("test").getCollection("persons", PersonEntity.class);
}
```
Normally, my `personCollection` should encode and decode `PersonEntity` objects only, but you can overwrite the type of object your collection is manipulating to return something different — in my case, `AverageAgeDTO.class`, as I’m not expecting a `PersonEntity` here but a POJO that contains only the average age of my "persons".
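For reference, `AverageAgeDTO` only needs to expose the `averageAge` accessor used above. A minimal sketch, assuming a Java 16+ record; the class in the actual repository may be written differently:

```java
public class AverageAgeDemo {

    // Minimal DTO shape matching the aggregation's projection:
    // a single "averageAge" field, decoded from the pipeline result.
    public record AverageAgeDTO(double averageAge) {}

    public static void main(String[] args) {
        AverageAgeDTO dto = new AverageAgeDTO(29.5);
        System.out.println(dto.averageAge()); // prints 29.5
    }
}
```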
## Swagger | md | {
"tags": [
"Java",
"Spring"
],
"pageDescription": "Take a shortcut to REST APIs with this Java/Spring Boot and MongoDB example application that embeds all you'll need to get going.",
"contentType": "Code Example"
} | REST APIs with Java, Spring Boot, and MongoDB | 2024-05-20T17:32:23.500Z | [
devcenter | https://www.mongodb.com/developer/code-examples/java/rest-apis-java-spring-boot | created |
Swagger is the tool you need to document your REST APIs. You have nothing to do — the configuration is completely automated. Just run the server and navigate to http://localhost:8080/swagger-ui.html. The interface will be waiting for you. See the Swagger documentation for more information.
## Nyan Cat
Yes, there is a Nyan Cat section in this post. Nyan Cat is love, and you need some Nyan Cat in your projects. :-)
Did you know that you can replace the Spring Boot logo in the logs with pretty much anything you want?
and the "Epic" font for each project name. It's easier to identify which log file I am currently reading.
## Conclusion
I hope you like my template, and I hope I will help you be more productive with MongoDB and the Java stack. | md | {
"tags": [
"Java",
"Spring"
],
"pageDescription": "Take a shortcut to REST APIs with this Java/Spring Boot and MongoDB example application that embeds all you'll need to get going.",
"contentType": "Code Example"
} | REST APIs with Java, Spring Boot, and MongoDB | 2024-05-20T17:32:23.500Z | [
devcenter | https://www.mongodb.com/developer/code-examples/java/rest-apis-java-spring-boot | created |
If you see something which can be improved, please feel free to open a GitHub issue or directly submit a pull request. They are very welcome. :-)
If you are new to MongoDB Atlas, give our Quick Start post a try to get up to speed with MongoDB Atlas in no time.
[1]: https://images.contentstack.io/v3/assets/blt39790b633ee0d5a7/blt876f3404c57aa244/65388189377588ba166497b0/swaggerui.png
[2]: https://images.contentstack.io/v3/assets/blt39790b633ee0d5a7/bltf2f06ba5af19464d/65388188d31953242b0dbc6f/nyancat.png | md | {
"tags": [
"Java",
"Spring"
],
"pageDescription": "Take a shortcut to REST APIs with this Java/Spring Boot and MongoDB example application that embeds all you'll need to get going.",
"contentType": "Code Example"
} | REST APIs with Java, Spring Boot, and MongoDB | 2024-05-20T17:32:23.500Z | [
devcenter | https://www.mongodb.com/developer/languages/swift/halting-development-on-swift-driver | created | # Halting Development on MongoDB Swift Driver
MongoDB is halting development on our server-side Swift driver. We remain excited about Swift and will continue our development of our mobile Swift SDK.
We released our server-side Swift driver in 2020 as an open source project and are incredibly proud of the work that our engineering team has contributed to the Swift community over the last four years. Unfortunately, today we are announcing our decision to stop development of the MongoDB server-side Swift driver. We understand that this news may come as a disappointment to the community of current users.
There are still ways to use MongoDB with Swift:
- Use the MongoDB driver with server-side Swift applications as is
- Use the MongoDB C Driver directly in your server-side Swift projects
- Use another community Swift driver, such as MongoKitten
"tags": [
"Swift",
"MongoDB"
],
"pageDescription": "The latest news regarding the MongoDB driver for Swift.",
"contentType": "News & Announcements"
} | Halting Development on MongoDB Swift Driver | 2024-05-20T17:32:23.500Z | [
devcenter | https://www.mongodb.com/developer/languages/swift/halting-development-on-swift-driver | created | Community members and developers are welcome to fork our existing driver and add features as you see fit - the Swift driver is under the Apache 2.0 license and source code is available on GitHub. For those developing client/mobile applications, MongoDB offers the Realm Swift SDK with real time sync to MongoDB Atlas.
We would like to take this opportunity to express our heartfelt appreciation for the enthusiastic support that the Swift community has shown for MongoDB. Your loyalty and feedback have been invaluable to us throughout our journey, and we hope to resume development on the server-side Swift driver in the future. | md | {
"tags": [
"Swift",
"MongoDB"
],
"pageDescription": "The latest news regarding the MongoDB driver for Swift.",
"contentType": "News & Announcements"
} | Halting Development on MongoDB Swift Driver | 2024-05-20T17:32:23.500Z | [
devcenter | https://www.mongodb.com/developer/products/atlas/online-archive-query-performance | created | # Optimizing your Online Archive for Query Performance
## Contributed By
This article was contributed by Prem Krishna, a Senior Product Manager for Analytics at MongoDB.
## Introduction
With Atlas Online Archive, you can tier off cold data or infrequently accessed data from your MongoDB cluster to a MongoDB-managed cloud object storage - Amazon S3 or Microsoft Azure Blob Storage. This can lower the cost via archival cloud storage for old data, while active data that is more often accessed and queried remains in the primary database.
> FYI: If using Online Archive and also using MongoDB's Atlas Data Federation, users can also see a unified view of production data, and *archived data* side by side through a read-only, federated database instance.
In this blog, we are going to be discussing how to improve the performance of your online archive by choosing the correct partitioning fields. | md | {
"tags": [
"Atlas",
"AWS"
],
"pageDescription": "Get all the do's and don'ts around optimization of your data archival strategy.",
"contentType": "Article"
} | Optimizing your Online Archive for Query Performance | 2024-05-20T17:32:23.500Z | [
devcenter | https://www.mongodb.com/developer/products/atlas/online-archive-query-performance | created |
## Why is partitioning so critical when configuring Online Archive?
Once you have started archiving data, you cannot edit any partition fields as the structure of how the data will be stored in the object storage becomes fixed after the archival job begins. Therefore, you'll want to think critically about your partitioning strategy beforehand.
Also, archival query performance is determined by how the data is structured in object storage, so it is important to not only choose the correct partitions but also choose the correct order of partitions. | md | {
"tags": [
"Atlas",
"AWS"
],
"pageDescription": "Get all the do's and don'ts around optimization of your data archival strategy.",
"contentType": "Article"
} | Optimizing your Online Archive for Query Performance | 2024-05-20T17:32:23.500Z | [
devcenter | https://www.mongodb.com/developer/products/atlas/online-archive-query-performance | created | ## Do this...
**Choose the most frequently queried fields.** You can choose up to 2 partition fields for a custom query-based archive or up to three fields on a date-based online archive. Ensure that the most frequently queried fields for the archive are chosen. Note that we are talking about how you are going to query the archive and not the custom query criteria provided at the time of archiving!
**Check the order of partitioned fields.** While selecting the partitions is important, it is equally critical to choose the correct *order* of partitions. The most frequently queried field should be the first chosen partition field, followed by the second and third. That's simple enough. | md | {
"tags": [
"Atlas",
"AWS"
],
"pageDescription": "Get all the do's and don'ts around optimization of your data archival strategy.",
"contentType": "Article"
} | Optimizing your Online Archive for Query Performance | 2024-05-20T17:32:23.500Z | [
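To make the ordering concrete: when an Online Archive is created through the Atlas Administration API, each partition field carries an explicit `order` value, with `0` being the first (most frequently queried) field. A sketch of the relevant payload fragment, with placeholder field names; check the Atlas documentation for the full schema:

```json
{
  "dbName": "sample_db",
  "collName": "sample_collection",
  "partitionFields": [
    { "fieldName": "Field_A", "order": 0 },
    { "fieldName": "Field_B", "order": 1 }
  ]
}
```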
devcenter | https://www.mongodb.com/developer/products/atlas/online-archive-query-performance | created | ## Not this
**Don't add irrelevant fields as partitions.** If you are not querying a specific field from the archive, then that field should not be added as a partition field. Remember that you can add a maximum of 2 or 3 partition fields, so it is important to choose these fields carefully based on how you query your archive.
**Don't ignore the “Move down” option.** The “Move down” option is applicable to an archive with a date-based rule. For example, if you want to query on Field_A the most, then Field_B, and then on exampleDate, ensure you are selecting the “Move Down” option next to the “Archive date field” on top.
"tags": [
"Atlas",
"AWS"
],
"pageDescription": "Get all the do's and don'ts around optimization of your data archival strategy.",
"contentType": "Article"
} | Optimizing your Online Archive for Query Performance | 2024-05-20T17:32:23.500Z | [
devcenter | https://www.mongodb.com/developer/products/atlas/online-archive-query-performance | created | **Don't choose high cardinality partition(s).** Choosing a high cardinality field such as `_id` will create a large number of partitions in the object storage. Then querying the archive for any aggregate based queries will cause increased latency. The same is applicable if multiple partitions are selected such that the collective fields when grouped together can be termed as high cardinality. For example, if you are selecting Field_A, Field_B and Field_C as your partitions and if a combination of these fields are creating unique values, then it will result in high cardinality partitions.
> Please note that this is **not applicable** for new Online Archives.
## Additional guidance
In addition to the partitioning guidelines, there are a couple of additional considerations that are relevant for the optimal configuration of your data archival strategy. | md | {
"tags": [
"Atlas",
"AWS"
],
"pageDescription": "Get all the do's and don'ts around optimization of your data archival strategy.",
"contentType": "Article"
} | Optimizing your Online Archive for Query Performance | 2024-05-20T17:32:23.500Z | [
devcenter | https://www.mongodb.com/developer/products/atlas/online-archive-query-performance | created | **Add data expiration rules and scheduled windows**
These settings are optional, but they are relevant to your use case: they can improve your archival speeds and control how long your data needs to remain in the archive.
**Index required fields**
Before archiving the data, ensure that your data is indexed for optimal performance. You can run an explain plan on the archival query to verify whether the archival rule will use an index.
## Conclusion
It is important to follow these do’s and don’ts before hitting “Begin Archiving” to archive your data so that the partitions are correctly configured thereby optimizing the performance of your online archives.
For more information on configuration or Online Archive, please see the documentation for setting up an Online Archive and our blog post on how to create an Online Archive.
Dig deeper into this topic with this tutorial. | md | {
"tags": [
"Atlas",
"AWS"
],
"pageDescription": "Get all the do's and don'ts around optimization of your data archival strategy.",
"contentType": "Article"
} | Optimizing your Online Archive for Query Performance | 2024-05-20T17:32:23.500Z | [
devcenter | https://www.mongodb.com/developer/products/atlas/online-archive-query-performance | created |
✅ Already have an AWS account? Atlas supports paying for usage via the AWS Marketplace (AWS MP) without any upfront commitment — simply sign up for MongoDB Atlas via AWS Marketplace. | md | {
"tags": [
"Atlas",
"AWS"
],
"pageDescription": "Get all the do's and don'ts around optimization of your data archival strategy.",
"contentType": "Article"
} | Optimizing your Online Archive for Query Performance | 2024-05-20T17:32:23.500Z | [
devcenter | https://www.mongodb.com/developer/products/atlas/using-confluent-cloud-atlas-stream-processing | created | # Using the Confluent Cloud with Atlas Stream Processing
> Atlas Stream Processing is now available. Learn more about it here.
Apache Kafka is a massively popular streaming platform today. It is available in the open-source community and also as software (e.g., Confluent Platform) for self-managing. Plus, you can get a hosted Kafka (or Kafka-compatible) service from a number of providers, including AWS Managed Streaming for Apache Kafka (MSK), RedPanda Cloud, and Confluent Cloud, to name a few.
In this tutorial, we will configure network connectivity between MongoDB Atlas Stream Processing instances and a topic within the Confluent Cloud. By the end of this tutorial, you will be able to process stream events from Confluent Cloud topics and emit the results back into a Confluent Cloud topic. | md | {
"tags": [
"Atlas"
],
"pageDescription": "Learn how to configure network connectivity between Confluent Cloud and MongoDB Atlas Stream Processing.",
"contentType": "Tutorial"
} | Using the Confluent Cloud with Atlas Stream Processing | 2024-05-20T17:32:23.500Z | [
… 384-dimensional embedding vector (values truncated) …
devcenter | https://www.mongodb.com/developer/products/atlas/using-confluent-cloud-atlas-stream-processing | created | Confluent Cloud dedicated clusters support connectivity through secure public internet endpoints with their Basic and Standard clusters. Private network connectivity options such as Private Link connections, VPC/VNet peering, and AWS Transit Gateway are available in the Enterprise and Dedicated cluster tiers.
**Note:** At the time of this writing, Atlas Stream Processing only supports internet-facing Basic and Standard Confluent Cloud clusters. This post will be updated to accommodate Enterprise and Dedicated clusters when support is provided for private networks.
The easiest way to get started with connectivity between Confluent Cloud and MongoDB Atlas is by using public internet endpoints. Public internet connectivity is the only option for Basic and Standard Confluent clusters. Rest assured that Confluent Cloud clusters with internet endpoints are protected by a proxy layer that guards against DoS, DDoS, SYN flooding, and other network-level attacks. We will also use authentication API keys with the SASL_SSL authentication method for secure credential exchange. | md | {
"tags": [
"Atlas"
],
"pageDescription": "Learn how to configure network connectivity between Confluent Cloud and MongoDB Atlas Stream Processing.",
"contentType": "Tutorial"
} | Using the Confluent Cloud with Atlas Stream Processing | 2024-05-20T17:32:23.500Z | [
… 384-dimensional embedding vector (values truncated) …
devcenter | https://www.mongodb.com/developer/products/atlas/using-confluent-cloud-atlas-stream-processing | created | In this tutorial, we will set up and configure Confluent Cloud and MongoDB Atlas for network connectivity and then work through a simple example that uses a sample data generator to stream data between MongoDB Atlas and Confluent Cloud.
## Tutorial prerequisites
This is what you’ll need to follow along:
- An Atlas project (free or paid tier)
- An Atlas database user with atlasAdmin permission
- For the purposes of this tutorial, we’ll have the user “tutorialuser.”
- MongoDB shell (Mongosh) version 2.0+
- Confluent Cloud cluster (any configuration)
## Configure Confluent Cloud
For this tutorial, you need a Confluent Cloud cluster created with a topic, “solardata,” and an API access key created. If you already have this, you may skip to Step 2. | md | {
"tags": [
"Atlas"
],
"pageDescription": "Learn how to configure network connectivity between Confluent Cloud and MongoDB Atlas Stream Processing.",
"contentType": "Tutorial"
} | Using the Confluent Cloud with Atlas Stream Processing | 2024-05-20T17:32:23.500Z | [
… 384-dimensional embedding vector (values truncated) …
devcenter | https://www.mongodb.com/developer/products/atlas/using-confluent-cloud-atlas-stream-processing | created | To create a Confluent Cloud cluster, log into the Confluent Cloud portal, select or create an environment for your cluster, and then click the “Add Cluster” button.
In this tutorial, we can use a **Basic** cluster type.
In MongoDB Atlas, click on “Stream Processing” from the Services menu. Next, click on the “Create Instance” button. Provide a name, cloud provider, and region. Note: For a lower network cost, choose the cloud provider and region that matches your Confluent Cloud cluster. In this tutorial, we will use AWS us-east-1 for both Confluent Cloud and MongoDB Atlas.
before continuing this tutorial.
Connection information can be found by clicking on the “Connect” button on your SPI. The connect dialog is similar to the connect dialog when connecting to an Atlas cluster. To connect to the SPI, you will need to use the **mongosh** command line tool.
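Once connected, you can sanity-check the pipeline you plan to run. The snippet below is a sketch only: the connection name `confluentCloud` is a placeholder for whatever you named the Confluent connection in your SPI's connection registry, and `sp.process()` is available only inside a mongosh session connected to the SPI.

```javascript
// Hypothetical pipeline for an ephemeral Atlas Stream Processing
// processor that reads events from the Confluent Cloud topic.
const pipeline = [
  {
    $source: {
      connectionName: "confluentCloud", // name from the SPI connection registry
      topic: "solardata",
    },
  },
];

// Inside mongosh connected to the SPI you would then run:
//   sp.process(pipeline);
// which streams matching events to the shell until interrupted.
console.log(pipeline[0].$source.topic);
```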
| md | {
"tags": [
"Atlas"
],
"pageDescription": "Learn how to configure network connectivity between Confluent Cloud and MongoDB Atlas Stream Processing.",
"contentType": "Tutorial"
} | Using the Confluent Cloud with Atlas Stream Processing | 2024-05-20T17:32:23.500Z | [
… 384-dimensional embedding vector (values truncated) …
devcenter | https://www.mongodb.com/developer/products/atlas/using-confluent-cloud-atlas-stream-processing | created | .
> Log in today to get started. Atlas Stream Processing is now available to all developers in Atlas. Give it a try today! | md | {
"tags": [
"Atlas"
],
"pageDescription": "Learn how to configure network connectivity between Confluent Cloud and MongoDB Atlas Stream Processing.",
"contentType": "Tutorial"
} | Using the Confluent Cloud with Atlas Stream Processing | 2024-05-20T17:32:23.500Z | [
… 384-dimensional embedding vector (values truncated) …
devcenter | https://www.mongodb.com/developer/products/atlas/using-confluent-cloud-atlas-stream-processing | created | [1]: https://images.contentstack.io/v3/assets/blt39790b633ee0d5a7/bltcfb9c8a1f971ace1/652994177aecdf27ae595bf9/image24.png
[2]: https://images.contentstack.io/v3/assets/blt39790b633ee0d5a7/blt63a22c62ae627895/652994381e33730b6478f0d1/image5.png
[3]: https://images.contentstack.io/v3/assets/blt39790b633ee0d5a7/blte3f1138a6294748f/65299459382be57ed901d434/image21.png | md | {
"tags": [
"Atlas"
],
"pageDescription": "Learn how to configure network connectivity between Confluent Cloud and MongoDB Atlas Stream Processing.",
"contentType": "Tutorial"
} | Using the Confluent Cloud with Atlas Stream Processing | 2024-05-20T17:32:23.500Z | [
… 384-dimensional embedding vector (values truncated) …
devcenter | https://www.mongodb.com/developer/products/atlas/using-confluent-cloud-atlas-stream-processing | created | [4]: https://images.contentstack.io/v3/assets/blt39790b633ee0d5a7/blt3ccf2827c99f1c83/6529951a56a56b7388898ede/image19.png
[5]: https://images.contentstack.io/v3/assets/blt39790b633ee0d5a7/bltaea830d5730e5f51/652995402e91e47b2b547e12/image20.png
[6]: https://images.contentstack.io/v3/assets/blt39790b633ee0d5a7/blt9c425a65bb77f282/652995c0451768c2b6719c5f/image13.png | md | {
"tags": [
"Atlas"
],
"pageDescription": "Learn how to configure network connectivity between Confluent Cloud and MongoDB Atlas Stream Processing.",
"contentType": "Tutorial"
} | Using the Confluent Cloud with Atlas Stream Processing | 2024-05-20T17:32:23.500Z | [
… 384-dimensional embedding vector (values truncated) …
devcenter | https://www.mongodb.com/developer/products/atlas/using-confluent-cloud-atlas-stream-processing | created | [7]: https://images.contentstack.io/v3/assets/blt39790b633ee0d5a7/blt2748832416fdcf8e/652996cd24aaaa5cb2e56799/image15.png
[8]: https://images.contentstack.io/v3/assets/blt39790b633ee0d5a7/blt9010c25a76edb010/652996f401c1899afe4a465b/image7.png
[9]: https://images.contentstack.io/v3/assets/blt39790b633ee0d5a7/blt27b3762b12b6b871/652997508adde5d1c8f78a54/image3.png | md | {
"tags": [
"Atlas"
],
"pageDescription": "Learn how to configure network connectivity between Confluent Cloud and MongoDB Atlas Stream Processing.",
"contentType": "Tutorial"
} | Using the Confluent Cloud with Atlas Stream Processing | 2024-05-20T17:32:23.500Z | [
… 384-dimensional embedding vector (values truncated) …
devcenter | https://www.mongodb.com/developer/products/atlas/charts-javascript-sdk | created | Refresh
Only in USA | md | {
"tags": [
"Atlas",
"JavaScript"
],
"pageDescription": "Learn how to visualize your data with MongoDB Charts.",
"contentType": "Tutorial"
} | Working with MongoDB Charts and the New JavaScript SDK | 2024-05-20T17:32:23.500Z | [
… 384-dimensional embedding vector (values truncated) …
devcenter | https://www.mongodb.com/developer/products/atlas/how-send-mongodb-document-changes-slack-channel | created | # How to Send MongoDB Document Changes to a Slack Channel
In this tutorial, we will explore a seamless integration of your database with Slack using Atlas Triggers and the Slack API. Discover how to effortlessly send notifications to your desired Slack channels, connecting the operations happening within your collections to real-time updates in Slack.
The overall flow: an Atlas database trigger watches the collection and calls a function that forwards each change event to Slack.
Once this has been completed, we are ready to start creating our first database trigger that will react every time there is an operation in a certain collection.
## Atlas trigger
For this tutorial, we will create a trigger that monitors all changes in a `test` collection for `insert`, `update`, and `delete` operations.
To create a new database trigger, you will need to: | md | {
"tags": [
"Atlas",
"JavaScript"
],
"pageDescription": "Learn how to use triggers in MongoDB Atlas to send information about changes to a document to Slack.",
"contentType": "Tutorial"
} | How to Send MongoDB Document Changes to a Slack Channel | 2024-05-20T17:32:23.500Z | [
… 384-dimensional embedding vector (values truncated) …
devcenter | https://www.mongodb.com/developer/products/atlas/how-send-mongodb-document-changes-slack-channel | created | To create a new database trigger, you will need to:
1. Click the **Data Services** tab in the top navigation of your screen if you haven't already navigated to Atlas.
2. Click **Triggers** in the left-hand navigation.
3. On the **Overview** tab of the **Triggers** page, click **Add Trigger** to open the trigger configuration page.
4. Enter the configuration values for the trigger and click **Save** at the bottom of the page.
Please note that this trigger will make use of the *event ordering* as we want the operations to be processed according to when they were performed.
The trigger configuration values will look like this:
To create a function using the UI, we need to:
1. Click the **Data Services** tab in the top navigation of your screen if you haven't already navigated to Atlas.
2. Click **Functions** in the left navigation menu. | md | {
"tags": [
"Atlas",
"JavaScript"
],
"pageDescription": "Learn how to use triggers in MongoDB Atlas to send information about changes to a document to Slack.",
"contentType": "Tutorial"
} | How to Send MongoDB Document Changes to a Slack Channel | 2024-05-20T17:32:23.500Z | [
… 384-dimensional embedding vector (values truncated) …
devcenter | https://www.mongodb.com/developer/products/atlas/how-send-mongodb-document-changes-slack-channel | created | 2. Click **Functions** in the left navigation menu.
3. Click **New Function** in the top right of the **Functions** page.
4. Enter a unique, identifying name for the function in the **Name** field.
5. Configure **User Authentication**. Functions in App Services always execute in the context of a specific application user or as a system user that bypasses rules. For this tutorial, we are going to use **System user**.
### "processEvent" function
The processEvent function will run every time one of the monitored operations occurs in the given collection. It builds an object from the change event, which we then send to the function in charge of posting the message to Slack.
The code of the function is the following:
```javascript
exports = function(changeEvent) {
const docId = changeEvent.documentKey._id; | md | {
"tags": [
"Atlas",
"JavaScript"
],
"pageDescription": "Learn how to use triggers in MongoDB Atlas to send information about changes to a document to Slack.",
"contentType": "Tutorial"
} | How to Send MongoDB Document Changes to a Slack Channel | 2024-05-20T17:32:23.500Z | [
… 384-dimensional embedding vector (values truncated) …
devcenter | https://www.mongodb.com/developer/products/atlas/how-send-mongodb-document-changes-slack-channel | created | ```javascript
exports = function(changeEvent) {
const docId = changeEvent.documentKey._id;
const { updateDescription, operationType } = changeEvent;
var object = {
operationType,
docId,
};
if (updateDescription) {
const updatedFields = updateDescription.updatedFields; // A document containing updated fields
const removedFields = updateDescription.removedFields; // An array of removed fields
object = {
...object,
updatedFields,
removedFields
};
}
const result = context.functions.execute("sendToSlack", object);
return true;
};
```
In this function, we will create an object that we will then send as a parameter to another function that will be in charge of sending the message to our Slack channel.
Here we will use change event and its properties to capture the: | md | {
"tags": [
"Atlas",
"JavaScript"
],
"pageDescription": "Learn how to use triggers in MongoDB Atlas to send information about changes to a document to Slack.",
"contentType": "Tutorial"
} | How to Send MongoDB Document Changes to a Slack Channel | 2024-05-20T17:32:23.500Z | [
… 384-dimensional embedding vector (values truncated) …
devcenter | https://www.mongodb.com/developer/products/atlas/how-send-mongodb-document-changes-slack-channel | created | Here we will use change event and its properties to capture the:
1. `_id` of the object that has been modified/inserted.
2. Operation that has been performed.
3. Fields of the object that have been modified or deleted when the operation has been an `update`.
With all this, we create an object and make use of the internal function calls to execute our `sendToSlack` function.
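To see the shape of what gets handed to `sendToSlack`, here is a standalone Node.js sketch of the same transformation; the Atlas-specific `context.functions` call is left out, and the sample change event below is hypothetical:

```javascript
// Standalone sketch of the processEvent transformation. In Atlas,
// the change event is supplied by the trigger and the result would
// be passed to the sendToSlack function instead of returned here.
function buildSlackPayload(changeEvent) {
  const docId = changeEvent.documentKey._id;
  const { updateDescription, operationType } = changeEvent;
  let object = { operationType, docId };
  if (updateDescription) {
    object = {
      ...object,
      updatedFields: updateDescription.updatedFields, // document of updated fields
      removedFields: updateDescription.removedFields, // array of removed field names
    };
  }
  return object;
}

// Hypothetical update event, shaped like a MongoDB change event.
const sample = {
  operationType: "update",
  documentKey: { _id: "651a2b3c" },
  updateDescription: {
    updatedFields: { status: "done" },
    removedFields: ["draft"],
  },
};

console.log(buildSlackPayload(sample));
```

Running this prints the payload object that the Slack-facing function receives.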
### "sendToSlack" function
This function will make use of the "chat.postMessage" method of the Slack API to send a message to a specific channel.
To use the Slack library, you must add it as a dependency in your Atlas function. Therefore, in the **Functions** section, we must go to the **Dependencies** tab and install `@slack/web-api`.
You will need to have a Slack token that will be used for creating the `WebClient` object as well as a Slack application. Therefore: | md | {
"tags": [
"Atlas",
"JavaScript"
],
"pageDescription": "Learn how to use triggers in MongoDB Atlas to send information about changes to a document to Slack.",
"contentType": "Tutorial"
} | How to Send MongoDB Document Changes to a Slack Channel | 2024-05-20T17:32:23.500Z | [
… 384-dimensional embedding vector (values truncated) …
devcenter | https://www.mongodb.com/developer/products/atlas/how-send-mongodb-document-changes-slack-channel | created | 1. Create or use an existing Slack app: This is necessary as the subsequent token we will need will be linked to a Slack App. For this step, you can navigate to the Slack application and use your credentials to authenticate and create or use an existing app you are a member of.
2. Within this app, we will need to create a bot token that will hold the authentication API key to send messages to the corresponding channel in the Slack app created. Please note that you will need to add as many authorization scopes on your token as you need, but the bare minimum is to add the `chat:write` scope to allow your app to post messages.
A full guide on how to get these two can be found in the Slack official documentation.
First, we will perform the logic with the received object to create a message adapted to the event that occurred. | md | {
"tags": [
"Atlas",
"JavaScript"
],
"pageDescription": "Learn how to use triggers in MongoDB Atlas to send information about changes to a document to Slack.",
"contentType": "Tutorial"
} | How to Send MongoDB Document Changes to a Slack Channel | 2024-05-20T17:32:23.500Z | [
… 384-dimensional embedding vector (values truncated) …
devcenter | https://www.mongodb.com/developer/products/atlas/how-send-mongodb-document-changes-slack-channel | created | First, we will perform the logic with the received object to create a message adapted to the event that occurred.
```javascript
var message = "";
if (arg.operationType == 'insert') {
  message += `A new document with id \`${arg.docId}\` has been inserted`;
} else if (arg.operationType == 'update') {
  message += `The document \`${arg.docId}\` has been updated.`;
  if (arg.updatedFields && Object.keys(arg.updatedFields).length > 0) {
    message += ` The fields ${JSON.stringify(arg.updatedFields)} have been modified.`;
  }
  if (arg.removedFields && arg.removedFields.length > 0) {
    message += ` The fields ${JSON.stringify(arg.removedFields)} have been removed.`;
  }
} else {
  message += `An unexpected operation affecting document \`${arg.docId}\` occurred`;
}
``` | md | {
"tags": [
"Atlas",
"JavaScript"
],
"pageDescription": "Learn how to use triggers in MongoDB Atlas to send information about changes to a document to Slack.",
"contentType": "Tutorial"
} | How to Send MongoDB Document Changes to a Slack Channel | 2024-05-20T17:32:23.500Z | [
… 384-dimensional embedding vector (values truncated) …
devcenter | https://www.mongodb.com/developer/products/atlas/how-send-mongodb-document-changes-slack-channel | created | Once we have the library, we must use it to create a `WebClient` client that we will use later to make use of the methods we need.
```javascript
const { WebClient } = require('@slack/web-api');
// Read a token from the environment variables
const token = context.values.get('SLACK_TOKEN');
// Initialize
const app = new WebClient(token);
```
Finally, we can send our message with:
```javascript
try {
// Call the chat.postMessage method using the WebClient
const result = await app.chat.postMessage({
channel: channelId,
text: `New Event: ${message}`
});
console.log(result);
}
catch (error) {
console.error(error);
}
```
The full function code will be as:
```javascript
exports = async function(arg){ | md | {
"tags": [
"Atlas",
"JavaScript"
],
"pageDescription": "Learn how to use triggers in MongoDB Atlas to send information about changes to a document to Slack.",
"contentType": "Tutorial"
} | How to Send MongoDB Document Changes to a Slack Channel | 2024-05-20T17:32:23.500Z | [
… 384-dimensional embedding vector (values truncated) …
devcenter | https://www.mongodb.com/developer/products/atlas/how-send-mongodb-document-changes-slack-channel | created | The full function code will be as:
```javascript
exports = async function(arg){
const { WebClient } = require('@slack/web-api');
// Read a token from the environment variables
const token = context.values.get('SLACK_TOKEN');
const channelId = context.values.get('CHANNEL_ID');
// Initialize
const app = new WebClient(token); | md | {
"tags": [
"Atlas",
"JavaScript"
],
"pageDescription": "Learn how to use triggers in MongoDB Atlas to send information about changes to a document to Slack.",
"contentType": "Tutorial"
} | How to Send MongoDB Document Changes to a Slack Channel | 2024-05-20T17:32:23.500Z | [
… 384-dimensional embedding vector (values truncated) …
devcenter | https://www.mongodb.com/developer/products/atlas/how-send-mongodb-document-changes-slack-channel | created | var message = "";
if (arg.operationType == 'insert') {
  message += `A new document with id \`${arg.docId}\` has been inserted`;
} else if (arg.operationType == 'update') {
  message += `The document \`${arg.docId}\` has been updated.`;
  if (arg.updatedFields && Object.keys(arg.updatedFields).length > 0) {
    message += ` The fields ${JSON.stringify(arg.updatedFields)} have been modified.`;
  }
  if (arg.removedFields && arg.removedFields.length > 0) {
    message += ` The fields ${JSON.stringify(arg.removedFields)} have been removed.`;
  }
} else {
  message += `An unexpected operation affecting document \`${arg.docId}\` occurred`;
}
"tags": [
"Atlas",
"JavaScript"
],
"pageDescription": "Learn how to use triggers in MongoDB Atlas to send information about changes to a document to Slack.",
"contentType": "Tutorial"
} | How to Send MongoDB Document Changes to a Slack Channel | 2024-05-20T17:32:23.500Z | [
… 384-dimensional embedding vector (values truncated) …
devcenter | https://www.mongodb.com/developer/products/atlas/how-send-mongodb-document-changes-slack-channel | created | try {
// Call the chat.postMessage method using the WebClient
const result = await app.chat.postMessage({
channel: channelId,
text: `New Event: ${message}`
});
console.log(result);
}
catch (error) {
console.error(error);
}
};
```
Note: The bot token we use must have the minimum permissions to send messages to a certain channel. We must also have the application created in Slack added to the channel where we want to receive the messages.
If everything is properly configured, every change in the collection and monitored operations will be received in the Slack channel:
You can also configure the trigger to only detect certain changes and then adapt the change event to only receive certain fields with a "$project".
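For example, hypothetical fragments of such a configuration: a match expression that only fires when a `status` field is updated, and a projection that keeps just the fields our functions read. The field names here are illustrative, not from the tutorial's collection.

```javascript
// Hypothetical trigger match expression: only fire when the update
// touched the "status" field of a document.
const matchExpression = {
  "updateDescription.updatedFields.status": { $exists: true },
};

// Hypothetical $project: trim the change event down to the fields
// that processEvent actually reads.
const projectExpression = {
  operationType: 1,
  documentKey: 1,
  "updateDescription.updatedFields": 1,
  "updateDescription.removedFields": 1,
};

console.log(Object.keys(projectExpression));
```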
## Conclusion | md | {
"tags": [
"Atlas",
"JavaScript"
],
"pageDescription": "Learn how to use triggers in MongoDB Atlas to send information about changes to a document to Slack.",
"contentType": "Tutorial"
} | How to Send MongoDB Document Changes to a Slack Channel | 2024-05-20T17:32:23.500Z | [
… 384-dimensional embedding vector (values truncated) …
devcenter | https://www.mongodb.com/developer/products/atlas/how-send-mongodb-document-changes-slack-channel | created | You can also configure the trigger to only detect certain changes and then adapt the change event to only receive certain fields with a "$project".
## Conclusion
In this tutorial, we've learned how to seamlessly integrate your database with Slack using Atlas Triggers and the Slack API. This integration allows you to send real-time notifications to your Slack channels, keeping your team informed about important operations within your database collections.
We started by creating a new application in Atlas and then set up a database trigger that reacts to specific collection operations. We explored the `processEvent` function, which processes change events and prepares the data for Slack notifications. Through a step-by-step process, we demonstrated how to create a message and use the Slack API to post it to a specific channel.
Now that you've grasped the basics, it's time to take your integration skills to the next level. Here are some steps you can follow: | md | {
"tags": [
"Atlas",
"JavaScript"
],
"pageDescription": "Learn how to use triggers in MongoDB Atlas to send information about changes to a document to Slack.",
"contentType": "Tutorial"
} | How to Send MongoDB Document Changes to a Slack Channel | 2024-05-20T17:32:23.500Z | [
… 384-dimensional embedding vector (values truncated) …
devcenter | https://www.mongodb.com/developer/products/atlas/how-send-mongodb-document-changes-slack-channel | created | - **Explore advanced use cases**: Consider how you can adapt the principles you've learned to more complex scenarios within your organization. Whether it's custom notifications or handling specific database events, there are countless possibilities.
- **Dive into the Slack API documentation**: For a deeper understanding of what's possible with Slack's API, explore their official documentation. This will help you harness the full potential of Slack's features.
By taking these steps, you'll be well on your way to creating powerful, customized integrations that can streamline your workflow and keep your team in the loop with real-time updates. Good luck with your integration journey! | md | {
"tags": [
"Atlas",
"JavaScript"
],
"pageDescription": "Learn how to use triggers in MongoDB Atlas to send information about changes to a document to Slack.",
"contentType": "Tutorial"
} | How to Send MongoDB Document Changes to a Slack Channel | 2024-05-20T17:32:23.500Z | [
… 384-dimensional embedding vector (values truncated) …
devcenter | https://www.mongodb.com/developer/products/atlas/how-send-mongodb-document-changes-slack-channel | created | [1]: https://images.contentstack.io/v3/assets/blt39790b633ee0d5a7/blt8fcfb82094f04d75/653816cde299fbd2960a4695/image2.png
[2]: https://images.contentstack.io/v3/assets/blt39790b633ee0d5a7/bltc7874f54dc0cd8be/653816e70d850608a2f05bb9/image3.png
[3]: https://images.contentstack.io/v3/assets/blt39790b633ee0d5a7/blt99aaf337d37c41ae/653816fd2c35813636b3a54d/image1.png | md | {
"tags": [
"Atlas",
"JavaScript"
],
"pageDescription": "Learn how to use triggers in MongoDB Atlas to send information about changes to a document to Slack.",
"contentType": "Tutorial"
} | How to Send MongoDB Document Changes to a Slack Channel | 2024-05-20T17:32:23.500Z | [
… 384-dimensional embedding vector (values truncated) …
devcenter | https://www.mongodb.com/developer/products/mongodb/doc-modeling-vector-search | created | # How to Model Your Documents for Vector Search
Atlas Vector Search was recently released, so let’s dive into a tutorial on how to properly model your documents when utilizing vector search to revolutionize your querying capabilities!
## Data modeling normally in MongoDB
Vector search is new, so let’s first go over the basic ways of modeling your data in a MongoDB document before continuing on into how to incorporate vector embeddings.
Data modeling in MongoDB revolves around organizing your data into documents within various collections. Different projects and organizations require different data model structures, because successful data modeling depends on the specific requirements of each application; for the most part, no one document design can be applied to every situation. There are some commonalities, though, that can guide the user. These are:
"tags": [
"MongoDB",
"AI"
],
"pageDescription": "Follow along with this comprehensive tutorial on how to properly model your documents for MongoDB Vector Search.",
"contentType": "Tutorial"
} | How to Model Your Documents for Vector Search | 2024-05-20T17:32:23.500Z | [
… 384-dimensional embedding vector (values truncated) …
devcenter | https://www.mongodb.com/developer/products/mongodb/doc-modeling-vector-search | created | 1. Choosing whether to embed or reference your related data.
2. Using arrays in a document.
3. Indexing your documents (finding fields that are frequently used and applying the appropriate indexing, etc.).
For a more in-depth explanation and a comprehensive guide of data modeling with MongoDB, please check out our data modeling article.
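As a minimal, hypothetical illustration of the first point, the same relationship can be either embedded or referenced; the collection shapes and field names below are made up for illustration:

```javascript
// Embedding: the related data lives inside the parent document,
// so a single read returns the show together with its episodes.
const embeddedShow = {
  _id: 1,
  title: "MongoDB TV",
  episodes: [
    { episodeNumber: 1, title: "Pilot" },
    { episodeNumber: 2, title: "Follow-up" },
  ],
};

// Referencing: episodes are separate documents that point back to
// the show by _id, which suits unbounded or frequently updated data.
const show = { _id: 1, title: "MongoDB TV" };
const episode = { _id: 101, showId: 1, episodeNumber: 1, title: "Pilot" };

console.log(embeddedShow.episodes.length, episode.showId);
```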
## Setting up an example data model
We are going to be building our vector embedding example using a MongoDB document for our MongoDB TV series. Here, we have a single MongoDB document representing our MongoDB TV show, without any embeddings in place. We have a nested array featuring our array of seasons, and within that, our array of different episodes. This way, in our document, we are capable of seeing exactly which season each episode is a part of, along with the episode number, the title, the description, and the date: | md | {
```
{
"_id": ObjectId("238478293"),
"title": "MongoDB TV",
"description": "All your MongoDB updates, news, videos, and podcast episodes, straight to you!",
 "genre": ["Programming", "Database", "MongoDB"],
"seasons": [
{
"seasonNumber": 1,
"episodes": [
{
"episodeNumber": 1,
"title": "EASY: Build Generative AI Applications",
"description": "Join Jesse Hall….",
 "date": ISODate("2023-10-05")
},
{
"episodeNumber": 2,
"title": "RAG Architecture & MongoDB: The Future of Generative AI Apps",
 "description": "Join Prakul Agarwal…",
 "date": ISODate("2023-10-04")
}
]
},
{
"seasonNumber": 2,
"episodes": [
{
"episodeNumber": 1,
"title": "Cloud Connect - Harness the Power of AI/ML and Generative AI on AWS with MongoDB Atlas",
"description": "Join Igor Alekseev….",
 "date": ISODate("2023-10-03")
},
{
"episodeNumber": 2,
"title": "The Index: Here’s what you missed last week…",
"description": "Join Megan Grant…",
 "date": ISODate("2023-10-02")
}
]
}
]
}
```
Now that we have our example set up, let’s incorporate vector embeddings and discuss the proper techniques to set you up for success.
## Integrating vector embeddings for vector search in our data model
Let’s first understand exactly what vector search is: Vector search is a way to search based on *meaning* rather than specific words. This comes in handy when querying using similarities rather than searching based on keywords. When using vector search, you can query using a question or a phrase rather than just a word. In a nutshell, vector search is great for when you can’t think of *exactly* that book or movie, but you remember the plot or the climax.
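To build some intuition for how "similarity of meaning" is measured, here is a minimal Python sketch of the cosine similarity metric that vector search engines commonly use. The vectors below are toy values, not output from a real embedding model, and Atlas computes similarity server-side; this is only to illustrate the idea.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: values near 1.0 mean the vectors point in the
    # same direction (similar meaning); lower values mean less similar.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings" standing in for encoded episode descriptions.
generative_ai_ep = [0.25, 0.5, 0.75, 0.1]
rag_architecture_ep = [0.24, 0.52, 0.7, 0.12]  # similar topic
news_roundup_ep = [0.9, 0.05, 0.1, 0.8]        # different topic

print(cosine_similarity(generative_ai_ep, rag_architecture_ep))  # close to 1
print(cosine_similarity(generative_ai_ep, news_roundup_ep))      # noticeably lower
```

The two "similar topic" vectors score much closer to 1 than the unrelated pair, which is exactly the property a vector search query exploits.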
This process happens when text, video, or audio is transformed via an encoder into vectors. With MongoDB, we can do this using OpenAI, Hugging Face, or other natural language processing models. Once we have our vectors, we can upload them in the base of our document and conduct vector search using them. Please keep in mind the current limitations of vector search and how to properly embed your vectors.
You can store your vector embeddings alongside other data in your document, or you can store them in a new collection. It is really up to the user and the project goals. Let’s go over what a document with vector embeddings can look like when you incorporate them into your data model, using the same example from above:
```
{
"_id": ObjectId("238478293"),
"title": "MongoDB TV",
"description": "All your MongoDB updates, news, videos, and podcast episodes, straight to you!",
 "genre": ["Programming", "Database", "MongoDB"],
 "vectorEmbeddings": [ 0.25, 0.5, 0.75, 0.1, 0.1, 0.8, 0.2, 0.6, 0.6, 0.4, 0.9, 0.3, 0.2, 0.7, 0.5, 0.8, 0.1, 0.8, 0.2, 0.6 ],
"seasons": [
{
"seasonNumber": 1,
 "episodes": [
{
"episodeNumber": 1,
"title": "EASY: Build Generative AI Applications",
"description": "Join Jesse Hall….",
 "date": ISODate("2023-10-05")
},
{
"episodeNumber": 2,
"title": "RAG Architecture & MongoDB: The Future of Generative AI Apps",
"description": "Join Prakul Agarwal…",
 "date": ISODate("2023-10-04")
}
]
},
{
"seasonNumber": 2,
"episodes": [
{
 "episodeNumber": 1,
"title": "Cloud Connect - Harness the Power of AI/ML and Generative AI on AWS with MongoDB Atlas",
"description": "Join Igor Alekseev….",
 "date": ISODate("2023-10-03")
},
{
"episodeNumber": 2,
"title": "The Index: Here’s what you missed last week…",
"description": "Join Megan Grant…",
 "date": ISODate("2023-10-02")
}
]
}
]
}
```
Here, you have your vector embeddings classified at the base of your document. Currently, there is a limitation where vector embeddings cannot be nested in an array in your document. Please ensure your document has your embeddings at the base. There are various tutorials on our Developer Center, alongside our YouTube account and our documentation, that can help you figure out how to embed these vectors into your document and how to acquire the necessary vectors in the first place.
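As a quick sanity check before indexing, you can verify that a document keeps its embedding as a flat array of numbers at the top level rather than nested inside an array. This is a hypothetical helper, not part of any MongoDB driver; the field name `vectorEmbeddings` matches the example above.

```python
def embedding_at_base(doc, field="vectorEmbeddings"):
    # The embedding must be a top-level field (not nested in an array)
    # and must be a flat list of numbers.
    value = doc.get(field)
    if not isinstance(value, list) or not value:
        return False
    return all(isinstance(x, (int, float)) for x in value)

show = {
    "title": "MongoDB TV",
    "vectorEmbeddings": [0.25, 0.5, 0.75, 0.1],
    "seasons": [{"seasonNumber": 1, "episodes": []}],
}
print(embedding_at_base(show))  # True
```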
## Extras: Indexing with vector search
When you’re using vector search, it is necessary to create a search index so you’re able to be successful with your semantic search. To do this, please view our Vector Search documentation. Here is the skeleton code provided by our documentation:
```
{
  "fields": [
    {
      "type": "vector",
      "path": "<field-to-index>",
      "numDimensions": <number-of-dimensions>,
      "similarity": "euclidean | cosine | dotProduct"
    },
    {
      "type": "filter",
      "path": "<field-to-index>"
    },
    ...
  ]
}
```
When setting up your search index, set "path" to the field that holds your vectors; in our case, that is "vectorEmbeddings". "type" can stay the way it is. For "numDimensions", match the dimensions of the model you’ve chosen. This is the number of vector dimensions, and the value cannot be greater than 4096. This limitation comes from the embedding model being used, so please ensure you’re using a supported model, such as one from OpenAI or Hugging Face; with those, you won’t run into issues with vector dimensions. For "similarity", pick the vector function you want to use to search for the top K nearest neighbors.
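Putting that together for our example document, a filled-in index definition might look like the following. The Python dict simply mirrors the JSON you would paste into Atlas. Note that `numDimensions` is 20 here only because the toy `vectorEmbeddings` array above has 20 values; with a real model you would use that model's output dimension (for example, 1536 for OpenAI's `text-embedding-ada-002`).

```python
# Index definition mirroring the Atlas Vector Search JSON skeleton above.
toy_embedding = [0.25, 0.5, 0.75, 0.1, 0.1, 0.8, 0.2, 0.6, 0.6, 0.4,
                 0.9, 0.3, 0.2, 0.7, 0.5, 0.8, 0.1, 0.8, 0.2, 0.6]

index_definition = {
    "fields": [
        {
            "type": "vector",
            "path": "vectorEmbeddings",           # top-level field holding the vectors
            "numDimensions": len(toy_embedding),  # must match your model's output size
            "similarity": "cosine",               # or "euclidean" / "dotProduct"
        }
    ]
}

print(index_definition["fields"][0]["numDimensions"])  # 20
```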
## Extras: Querying with vector search
When you’re ready to query and find results from your embedded documents, it’s time to create an aggregation pipeline on your embedded vector data. To do this, you can use the "$vectorSearch" operator, which is a new aggregation stage in Atlas. It helps execute an approximate nearest neighbor (ANN) query.
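As a sketch of what such a pipeline stage can look like, here is a `$vectorSearch` stage built as a plain Python dict. The index name `vector_index` and the query vector are assumptions for illustration; in practice, the query vector must come from the same embedding model as the stored vectors and have the same number of dimensions.

```python
# Embedding of the user's question/phrase (toy values; normally produced
# by the same model that embedded the documents).
query_vector = [0.2, 0.5, 0.7, 0.1, 0.1, 0.8, 0.2, 0.6, 0.6, 0.4,
                0.9, 0.3, 0.2, 0.7, 0.5, 0.8, 0.1, 0.8, 0.2, 0.6]

vector_search_stage = {
    "$vectorSearch": {
        "index": "vector_index",     # name of the Atlas Vector Search index (assumed)
        "path": "vectorEmbeddings",  # field indexed earlier
        "queryVector": query_vector,
        "numCandidates": 100,        # candidates considered during the ANN search
        "limit": 5,                  # top K documents returned
    }
}

pipeline = [vector_search_stage]
print(len(pipeline[0]["$vectorSearch"]["queryVector"]))  # 20
```

With a driver such as PyMongo, you would pass `pipeline` to `collection.aggregate(...)` to run the search.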
For more information on this step, please check out the tutorial on the Developer Center about building generative AI applications, and our YouTube video on vector search.
# Example Application for Dog Care Providers (DCP)
## Creator
Radvile Razmute contributed this project.
## About the project
My project explores how to use MongoDB Shell, MongoDB Atlas, and MongoDB Compass. This project aimed to develop a database for dog care providers and demonstrate how this data can be manipulated in MongoDB. The Dog Welfare Federation (DWF) is concerned that some providers who offer short/medium-term care for dogs when the owner is unable to (e.g., when away on holidays) may not be delivering the service they promise. Up to now, the DWF has managed the data using a SQL database. As the scale of its operations expanded, the organization needed to invest in a cloud database application. As an alternative to the relational SQL database, the Dog Welfare Federation decided to look at database development using MongoDB services.
The Dog database uses fictitious data that I have created myself. The different practical stages of the project have been documented in my project report and may guide beginners taking their first steps into MongoDB.
## Inspiration
The assignment was given to me by my lecturer. When he was deciding on the topics for the project, he knew that I love dogs, and that's why my project was all about the dogs. Even though the lecturer gave me the assignment, it was my idea to prepare this project in a way that does not only benefit me.
When I followed courses via MongoDB University, I noticed that these courses gave me a flavor of MongoDB, but not the basic concepts. I wanted to turn a database development project into a kind of guide for somebody who has never used MongoDB and who can take the project and say: "Okay, these are the basic concepts, this is what happens when you run the query, this is the result you get, and this is how you can validate that your result and your query are correct." That's how the whole MongoDB project for beginners was born.
My guide tells you how to use MongoDB, what steps you need to follow to create an application, upload data, use the data, etc. It's one thing to know what those operators are doing, but it's an entirely different thing to understand how they connect and what impact they make.
## Why MongoDB?
My lecturer Noel Tierney, a lecturer in Computer Applications at Athlone Institute of Technology, Ireland, gave me the assignment to use MongoDB. He gave me instructions on the project and what kind of outcome he would like to see. I was asked to use MongoDB, and I decided to dive deeper into everything the platform offers. Besides that, as mentioned briefly in the introduction, the organization DWF was planning on scaling and expanding its business, and it wanted to look into database development with MongoDB. This was a good chance for me to learn everything about NoSQL.
## How it works
The project teaches you how to set up a MongoDB database for dog care providers. It includes three main sections, covering MongoDB Shell, MongoDB Atlas, and MongoDB Compass. The MongoDB Shell section demonstrates how the data can be manipulated using simple queries and the aggregation method. I discuss how to import data into a local cluster, how to create queries, and how to retrieve and update data. The other two sections include an overview of MongoDB Atlas and MongoDB Compass; I also discuss querying and the aggregation framework per topic. Each section shows step-by-step instructions on how to set up the application and also includes some data manipulation examples. As mentioned above, I created all the sample data myself, which was a ton of work! I made a spreadsheet with 2,000 different lines of sample data. To do that, I had to Google dog breeds, dog names, and their temperaments. I wanted it to be close to reality.
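To illustrate the kind of retrieve query the guide walks through, here is a small Python sketch. The documents and field names (`name`, `breed`, `temperament`) are hypothetical stand-ins for the fictitious dog data described above; in mongosh, the equivalent would be a `find()` filter on the collection.

```python
# Hypothetical sample documents standing in for the Dog database.
dogs = [
    {"name": "Rex", "breed": "Labrador Retriever", "temperament": ["friendly", "active"]},
    {"name": "Milo", "breed": "Beagle", "temperament": ["curious", "merry"]},
    {"name": "Luna", "breed": "Labrador Retriever", "temperament": ["gentle"]},
]

def find_by_breed(docs, breed):
    # Mirrors a mongosh filter like: db.dogs.find({ "breed": "Labrador Retriever" })
    return [d for d in docs if d["breed"] == breed]

labs = find_by_breed(dogs, "Labrador Retriever")
print([d["name"] for d in labs])  # ['Rex', 'Luna']
```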
## Challenges and learning
When I started working with MongoDB, the first big thing I had to get over was the braces everywhere, so it was quite challenging for me to understand where a query finishes. But I’ve been reading a lot of documentation, and creating this guide gave me quite a good understanding of the basics of MongoDB. I learned a lot about the technical side of databases because I was never familiar with them before; I had no idea how they work. Learning about and using MongoDB was a great experience. Once I had everything set up (the MongoDB Shell, Compass, and Atlas), I could see how information moves between all these different environments, and that was awesome. I think it worked quite well. I hope that my guide will be valuable for new learners. It demonstrates that users like me, who had no prior skills in using MongoDB, can quickly become MongoDB developers.
Access the complete report, which includes the queries you need - here.
# Leafsteroid Resources
Leafsteroids is a MongoDB demo showing the following services and integrations:
**Atlas App Services**
All in one backend. Atlas App Services offers a full-blown REST service using Atlas Functions and HTTPS endpoints.
**Atlas Search**
Used to find the player nickname in the Web UI.
**Atlas Charts**
Event & personalized player dashboards accessible over the web. Built-in visualization right with your data. No additional tools required.
**Document Model**
Every game run is a single document demonstrating rich documents and “data that works together lives together”, while other data entities are simple collections (configuration).
**AWS Beanstalk**
Hosts the Blazor Server Application (website).
**AWS EC2**
Used internally by AWS Beanstalk. Used to host our Python game server.
**AWS S3**
Used internally by AWS Beanstalk.
**AWS Private Cloud**
Private VPN connection between AWS and MongoDB.
**At a MongoDB .local Event and want to register to play Leafsteroids? Register Here**
You can build & play Leafsteroids yourself with the following links
## Development Resources
| Resource | Link |
|---|---|
| GitHub Repo | Here |
| MongoDB TV Livestream | Here |
| MongoDB & AWS | Here |
| MongoDB on the AWS Marketplace | Here |
# Get Started with Atlas Stream Processing: Creating Your First Stream Processor
>Atlas Stream Processing is now available. Learn more about it here.
If you're not already familiar, Atlas Stream Processing enables processing high-velocity streams of complex data using the same data model and Query API that's used in MongoDB Atlas databases. Streaming data is increasingly critical to building responsive, event-driven experiences for your customers. Stream processing is a fundamental building block powering these applications, helping to tame the firehose of data coming from many sources, find important events in a stream, and combine data in motion with data at rest.
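To make the idea concrete, here is a small, self-contained Python sketch of the kind of work a stream processor does: consuming a stream of events and reducing it to something meaningful, in this case a fixed-size tumbling window average. This is only an illustration of the concept; in Atlas Stream Processing you express such logic declaratively with aggregation stages rather than loop code, and the `watts` field name is an assumption for illustration.

```python
def tumbling_window_averages(events, window_size):
    """Average the 'watts' reading over consecutive fixed-size windows."""
    averages = []
    window = []
    for event in events:
        window.append(event["watts"])
        if len(window) == window_size:
            averages.append(sum(window) / window_size)
            window = []
    return averages

# Simulated solar-device readings, loosely modeled on a sample stream.
readings = [{"watts": w} for w in [100, 120, 110, 90, 95, 115]]
print(tumbling_window_averages(readings, 3))  # [110.0, 100.0]
```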
In this tutorial, we will create a stream processor that uses sample data included in Atlas Stream Processing. By the end of the tutorial, you will have an operational Stream Processing Instance (SPI) configured with a stream processor. This environment can be used for further experimentation and Atlas Stream Processing tutorials in the future.
### Tutorial Prerequisites
This is what you'll need to follow along:
* An Atlas user with atlasAdmin permission. For the purposes of this tutorial, we'll have the user "tutorialuser".
* MongoDB shell (Mongosh) version 2.0+
## Create the Stream Processing Instance
Let's first create a Stream Processing Instance (SPI). Think of an SPI as a logical grouping of one or more stream processors. When created, the SPI has a connection string similar to a typical MongoDB Atlas cluster.
Under the Services tab in the Atlas Project, click "Stream Processing". Then click the "Create Instance" button.
This will launch the Create Instance dialog.
Enter your desired cloud provider and region, and then click "Create". You will receive a confirmation dialog upon successful creation.
## Configure the connection registry
The connection registry stores connection information to the external data sources you wish to use within a stream processor. In this example, we will use a sample data generator that is available without any extra configuration, but typically you would connect to either Kafka or an Atlas database as a source.
To manage the connection registry, click on "Configure" to navigate to the configuration screen.
Once on the configuration screen, click on the "Connection Registry" tab.
Next, click on the "Add Connection" button. This will launch the Add Connection dialog.
From here, you can add connections to Kafka, other Atlas clusters within the project, or a sample stream. In this tutorial, we will use the Sample Stream connection. Click on "Sample Stream" and select "sample_stream_solar" from the list of available sample streams. Then, click "Add Connection".
The new "sample_stream_solar" will show up in the list of connections.
## Connect to the Stream Processing Instance (SPI)
Now that we have both created the SPI and configured the connection in the connection registry, we can create a stream processor. First, we need to connect to the SPI that we created previously. This can be done using the MongoDB Shell (mongosh).
To obtain the connection string to the SPI, return to the main Stream Processing page by clicking on the "Stream Processing" menu under the Services tab.
Next, locate the "Tutorial" SPI we just created and click on the "Connect" button. This will present a connection dialog similar to what is found when connecting to MongoDB Atlas clusters.
For connecting, we'll need to add a connection IP address and create a database user, if we haven't already.
Then we'll choose our connection method. If you do not already have mongosh installed, install it using the instructions provided in the dialog.
Once mongosh is installed, copy the connection string from the "I have the MongoDB Shell installed" view and run it in your terminal.
```
Command Terminal > mongosh <> --tls --authenticationDatabase admin --username tutorialuser
Enter password: *******************
Current Mongosh Log ID: 64e9e3bf025581952de31587
Connecting to: mongodb://*****
Using MongoDB: 6.2.0
Using Mongosh: 2.0.0
For mongosh info see: https://docs.mongodb.com/mongodb-shell/
AtlasStreamProcessing>
```
To confirm your sample_stream_solar is added as a connection, issue `sp.listConnections()`. Our connection to sample_stream_solar is shown as expected.
```
AtlasStreamProcessing> sp.listConnections()
{
ok: 1,
  connections: [
    {
      name: 'sample_stream_solar',
      type: 'inmemory',
      createdAt: ISODate("2023-08-26T18:42:48.357Z")
    }
  ]
}
```
## Create a stream processor
If you are reading through this post as a prerequisite to another tutorial, you can return to that tutorial now to continue.
devcenter | https://www.mongodb.com/developer/products/atlas/create-first-stream-processor | created | In this section, we will wrap up by creating a simple stream processor to process the sample_stream_solar source that we have used throughout this tutorial. This sample_stream_solar source represents the observed energy production of different devices (unique solar panels). Stream processing could be helpful in measuring characteristics such as panel efficiency or when replacement is required for a device that is no longer producing energy at all.
First, let's define a `$source` stage to describe where Atlas Stream Processing will read the stream data from.
```
var solarstream={$source:{"connectionName": "sample_stream_solar"}}
```
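Pipeline stages like this are ordinary documents, so more stages can be composed into an array before sampling. As a minimal sketch (the `$match` filter and the device name are illustrative additions of ours, not part of the tutorial), this builds a pipeline that would limit the sampled output to a single device:

```javascript
// Stream processor pipelines are ordinary arrays of stage documents, so we
// can compose more stages before sampling. The $match stage and device name
// below are our own illustrative additions, not from the tutorial.
const solarstream = { $source: { connectionName: "sample_stream_solar" } };
const onlyDevice2 = { $match: { device_id: "device_2" } };

// Against the SPI in mongosh, this would be run as: sp.process(pipeline)
const pipeline = [solarstream, onlyDevice2];
console.log(JSON.stringify(pipeline));
```
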
Now we will issue `.process` to view the contents of the stream in the console.

`sp.process([solarstream])`
`.process` lets us sample our source data and quickly test the stages of a stream processor to ensure that it is set up as intended. A sample of this data is as follows:
```
{
device_id: 'device_2',
group_id: 3,
timestamp: '2023-08-27T13:51:53.375+00:00',
max_watts: 250,
event_type: 0,
obs: {
watts: 168,
temp: 15
},
_ts: ISODate("2023-08-27T13:51:53.375Z"),
_stream_meta: {
sourceType: 'sampleData',
timestamp: ISODate("2023-08-27T13:51:53.375Z")
}
}
```
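As a purely local illustration (plain JavaScript, not connected to Atlas), one way to derive the panel efficiency mentioned earlier from a sampled event like the one above is to compare observed watts against the device's rated maximum:

```javascript
// Purely local illustration (not connected to Atlas): derive a panel's
// efficiency from one sampled solar event, the kind of per-device
// characteristic the tutorial suggests measuring.
const event = {
  device_id: "device_2",
  max_watts: 250,
  obs: { watts: 168, temp: 15 },
};

const efficiency = event.obs.watts / event.max_watts;
console.log(`${event.device_id} efficiency: ${(efficiency * 100).toFixed(1)}%`);
// prints "device_2 efficiency: 67.2%"
```
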
## Wrapping up
In this tutorial, we started by introducing Atlas Stream Processing and why stream processing is a building block for powering modern applications. We then walked through the basics of creating a stream processor – we created a Stream Processing Instance, configured a source in our connection registry using sample solar data (included in Atlas Stream Processing), connected to a Stream Processing Instance, and finally tested our first stream processor using `.process`. You are now ready to explore Atlas Stream Processing and create your own stream processors, adding advanced functionality like windowing and validation.
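As a rough, purely local sketch of what the windowing just mentioned computes (hypothetical data and a hand-rolled loop -- not the Atlas Stream Processing API), a tumbling window produces one aggregate per fixed time interval:

```javascript
// Local sketch only: average watts per 10-second tumbling window,
// over hypothetical readings. Real windowing in Atlas Stream Processing
// is expressed as pipeline stages, not hand-written loops like this.
const readings = [
  { ts: 0, watts: 100 },
  { ts: 4, watts: 140 },
  { ts: 11, watts: 200 },
  { ts: 19, watts: 240 },
];

const windows = new Map();
for (const r of readings) {
  const key = Math.floor(r.ts / 10); // index of the 10-second window
  const w = windows.get(key) ?? { sum: 0, count: 0 };
  w.sum += r.watts;
  w.count += 1;
  windows.set(key, w);
}

for (const [key, w] of windows) {
  console.log(`window ${key}: avg ${w.sum / w.count} watts`);
}
```
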
If you enjoyed this tutorial and would like to learn more, check out the MongoDB Atlas Stream Processing announcement blog post. For more on stream processors in Atlas Stream Processing, visit our documentation.
### Learn more about MongoDB Atlas Stream Processing
For more on managing stream processors in Atlas Stream Processing, visit our documentation.
> Log in today to get started. Atlas Stream Processing is now available to all developers in Atlas. Give it a try today!