529 UI update llm embedder and vectordb selection pages #533
Conversation
frontend/src/pages/OnboardingFlow/Steps/EmbeddingPreference/index.jsx

Consider using useMemo to memoize the result of the filtering operation. This prevents unnecessary re-computation when the dependencies haven't changed, improving performance:

```jsx
const filteredEmbedders = useMemo(() => {
  const selectedEmbedderItem = EMBEDDERS.find(
    (embedder) => embedder.value === selectedEmbedder
  );
  let filtered = EMBEDDERS.filter((embedder) =>
    embedder.name.toLowerCase().includes(searchQuery.toLowerCase())
  );
  if (selectedEmbedderItem) {
    filtered = [
      selectedEmbedderItem,
      ...filtered.filter((embedder) => embedder.value !== selectedEmbedder),
    ];
  }
  return filtered;
}, [searchQuery, selectedEmbedder]);
```

frontend/src/pages/OnboardingFlow/Steps/VectorDatabaseConnection/index.jsx

Consider using useMemo to memoize the result of the filtering operation. This prevents unnecessary re-computation when the dependencies haven't changed, improving performance:

```jsx
const filteredVDBs = useMemo(() => {
  const selectedVDBItem = VECTOR_DBS.find((vdb) => vdb.value === selectedVDB);
  let filtered = VECTOR_DBS.filter((vdb) =>
    vdb.name.toLowerCase().includes(searchQuery.toLowerCase())
  );
  if (selectedVDBItem) {
    filtered = [
      selectedVDBItem,
      ...filtered.filter((vdb) => vdb.value !== selectedVDB),
    ];
  }
  return filtered;
}, [searchQuery, selectedVDB]);
```

frontend/src/pages/GeneralSettings/EmbeddingPreference/index.jsx

Consider using a memoized version of the filteredEmbedders array. This can improve performance by avoiding unnecessary re-renders:

```jsx
// Instead of this:
const [filteredEmbedders, setFilteredEmbedders] = useState([]);

// You could do this:
const filteredEmbedders = useMemo(() => {
  const selectedEmbedderItem = EMBEDDERS.find(
    (embedder) => embedder.value === selectedEmbedder
  );
  let filtered = EMBEDDERS.filter((embedder) =>
    embedder.name.toLowerCase().includes(searchQuery.toLowerCase())
  );
  if (selectedEmbedderItem) {
    filtered = [
      selectedEmbedderItem,
      ...filtered.filter((embedder) => embedder.value !== selectedEmbedder),
    ];
  }
  return filtered;
}, [searchQuery, selectedEmbedder]);
```

frontend/src/pages/GeneralSettings/LLMPreference/index.jsx

The useEffect hook that filters the LLMs runs every time searchQuery or selectedLLM changes, which could cause unnecessary re-renders and performance issues if the list of LLMs is large. Consider using a memoized filteredLLMs to avoid unnecessary computation:

```jsx
const filteredLLMs = useMemo(() => {
  const selectedLLMItem = LLMS.find((llm) => llm.value === selectedLLM);
  let filtered = LLMS.filter((llm) =>
    llm.name.toLowerCase().includes(searchQuery.toLowerCase())
  );
  // If LLM selected, move it to the top
  if (selectedLLMItem) {
    filtered = [
      selectedLLMItem,
      ...filtered.filter((llm) => llm.value !== selectedLLM),
    ];
  }
  return filtered;
}, [searchQuery, selectedLLM]);
```

The handleSubmit function is declared inside the component, which means it is recreated every time the component re-renders. This could lead to unnecessary re-renders and performance issues. Consider using the useCallback hook to memoize the function:

```jsx
const handleSubmit = useCallback(
  async (e) => {
    e.preventDefault();
    const form = e.target;
    const data = {};
    const formData = new FormData(form);
    data.LLMProvider = selectedLLM;
    for (var [key, value] of formData.entries()) data[key] = value;
    const { error } = await System.updateSystem(data);
    setSaving(true);
    if (error) {
      showToast(`Failed to save LLM settings: ${error}`, "error");
    } else {
      showToast("LLM preferences saved successfully.", "success");
    }
    setSaving(false);
    setHasChanges(!!error);
  },
  [selectedLLM]
);
```

frontend/src/pages/GeneralSettings/VectorDatabase/index.jsx

Consider using a memoized version of filteredVDBs to avoid unnecessary re-renders. This can be achieved with React's useMemo hook, which ensures filteredVDBs is only recalculated when searchQuery or selectedVDB changes:

```jsx
const filteredVDBs = useMemo(() => {
  const selectedVDBItem = VECTOR_DBS.find((vdb) => vdb.value === selectedVDB);
  let filtered = VECTOR_DBS.filter((vdb) =>
    vdb.name.toLowerCase().includes(searchQuery.toLowerCase())
  );
  if (selectedVDBItem) {
    filtered = [
      selectedVDBItem,
      ...filtered.filter((vdb) => vdb.value !== selectedVDB),
    ];
  }
  return filtered;
}, [searchQuery, selectedVDB]);
```
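For context on how a memoized filter like the ones suggested above typically plugs into one of these selection screens, here is a minimal, self-contained sketch. It is not the actual AnythingLLM component: the LLMPicker name, the contents of the LLMS array, and the input/list markup are hypothetical placeholders, but the useMemo body mirrors the review suggestion and shows the pattern the comments assume — a searchQuery and selectedLLM state pair, a derived list, and a render that maps over the memoized result.

```jsx
import { useMemo, useState } from "react";

// Hypothetical stand-in for the provider catalog the real page imports.
const LLMS = [
  { name: "OpenAI", value: "openai" },
  { name: "Azure OpenAI", value: "azure" },
  { name: "LocalAI", value: "localai" },
];

export default function LLMPicker() {
  const [searchQuery, setSearchQuery] = useState("");
  const [selectedLLM, setSelectedLLM] = useState(null);

  // Recomputed only when searchQuery or selectedLLM changes,
  // not on every unrelated re-render of the component.
  const filteredLLMs = useMemo(() => {
    const selectedLLMItem = LLMS.find((llm) => llm.value === selectedLLM);
    let filtered = LLMS.filter((llm) =>
      llm.name.toLowerCase().includes(searchQuery.toLowerCase())
    );
    // Keep the currently selected provider pinned to the top of the list.
    if (selectedLLMItem) {
      filtered = [
        selectedLLMItem,
        ...filtered.filter((llm) => llm.value !== selectedLLM),
      ];
    }
    return filtered;
  }, [searchQuery, selectedLLM]);

  return (
    <div>
      <input
        type="text"
        placeholder="Search LLM providers"
        value={searchQuery}
        onChange={(e) => setSearchQuery(e.target.value)}
      />
      <ul>
        {filteredLLMs.map((llm) => (
          <li key={llm.value} onClick={() => setSelectedLLM(llm.value)}>
            {llm.name}
          </li>
        ))}
      </ul>
    </div>
  );
}
```

Compared with the useState + useEffect approach the review calls out, this derives the filtered list directly from existing state, so there is no extra setter to keep in sync and no additional render pass just to update the derived array.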
…s#533)

* move llm, embedder, vectordb items to components folder
* add backdrop blur to search in llm, embedder, vectordb preferences
* implement searchable llm preference in settings
* implement searchable embedder in settings
* remove unused useState from embedder preferences
* implement searchable vector database in settings
* fix save changes button not appearing on change for llm, embedder, and vectordb settings pages
* sort selected items in all settings and put selected item at top of list
* no auto-top for selection

Co-authored-by: timothycarambat <rambat1010@gmail.com>
resolves #529