upgraded UI

This commit is contained in:
Saifeddine ALOUI 2024-09-29 10:17:55 +02:00
parent bf99ff55b2
commit 6cfe277a3c
19 changed files with 404 additions and 386 deletions

View File

@ -19,4 +19,7 @@ The models we have today are not conscious; they are just function calls. They do
At some point, we need to forbid those things from starting to think on their own. But projects like AutoGPT and LangChain are giving more control to the AI. Still, the human is in control, but less and less so. At least for now, bad things still come from humans and not from AI by itself.
But who knows?
By ParisNeo
2022

View File

@ -14,4 +14,8 @@ Philosopher 1: I agree with your point about focusing on the development of ethi
Child: But can you really stop the development of AI if people want it so much for different purposes, like helping the disabled or improving productivity in workplaces? It seems to me humans are going to develop this technology no matter what, because they see benefits in using it, and we should just try our best to regulate its usage.
Philosopher 2: I completely agree with you on that point! Humans will always find ways to improve themselves or their surroundings through technological advancements, so trying to ban AI outright is simply not realistic. However, what we can do as a society is try to slow down its development by raising awareness about the potential dangers it could bring, while also focusing on creating ethical guidelines for its usage.
Child: So should I be scared of AI or excited? :D
Philosopher 1 & Philosopher 2 in unison: Both! It all depends on how we use and regulate AI's development.
By ParisNeo
2022

View File

@ -35,4 +35,5 @@ This optimistic vision reminds us that technology, including AI, can be a powerf
In conclusion, the moment I witnessed on the tramway serves as a microcosm of a larger existential dilemma. As we continue to integrate AI into our lives, we must remain vigilant about preserving our cognitive and critical thinking abilities. After all, the essence of being human lies not in our reliance on tools, but in our capacity to think, adapt, and innovate. And with the right approach, we can harness the power of AI to elevate our civilization to new heights, much like the hopeful future envisioned in “Star Trek.”
ParisNeo
By ParisNeo
2022

View File

@ -21,4 +21,7 @@ The multitude of open-source models is a good thing. With enough luck, poison mod
And for non-certified models, maybe build a pool of judgement models that are trustworthy: we give them the same prompts as the models under test, look at the statistics of their answers, and flag the ones that fall out of consensus. Just like we do with science.
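To make the judgement-model idea concrete, here is a minimal sketch of that out-of-consensus check. Everything in it is hypothetical: the `ask` helper and the majority-vote rule are placeholders, not an existing lollms API.

```python
# Hypothetical sketch of the "pool of judgement models" consensus check.
# ask(model, prompt) is a placeholder for whatever inference call is available.
from collections import Counter

def consensus_check(prompt, candidate_answer, judge_models, ask):
    """Return whether an answer agrees with the judges' majority answer."""
    judge_answers = [ask(judge, prompt) for judge in judge_models]
    consensus, votes = Counter(judge_answers).most_common(1)[0]
    agreement_ratio = votes / len(judge_models)
    return candidate_answer == consensus, consensus, agreement_ratio
```

In practice the comparison would rely on semantic similarity rather than exact string equality, but the statistical idea stays the same.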
I hope we figure out a way. I'm sure this community is full of smart people who can pull it off.
By ParisNeo
2023

View File

@ -53,3 +53,7 @@ Ultimately, the ethical implications of AI depend on us. We must determine wheth
## Author
ParisNeo, with help from an AI (conditioned for essay writing) to shape the text.
By ParisNeo
2022

View File

@ -1,5 +1,5 @@
# =================== Lord Of Large Language Multimodal Systems Configuration file ===========================
version: 138
version: 139
binding_name: null
model_name: null
model_variant: null
@ -105,7 +105,7 @@ active_tts_service: "None" # xtts (offline), openai_tts (API key required), elev
active_tti_service: "None" # autosd (offline), diffusers (offline), diffusers_client (online), dall-e (online), midjourney (online)
active_stt_service: "None" # whisper (offline), asr (offline or online), openai_whisper (API key required)
active_ttm_service: "None" # musicgen (offline)
active_ttv_service: "None" # cog_video_x (offline)
active_ttv_service: "None" # cog_video_x, diffusers, lumalab (offline)
# -------------------- Services --------------------------
# ***************** STT *****************
@ -217,6 +217,9 @@ motion_ctrl_base_url: http://localhost:7861
# ***************** TTV *****************
cog_video_x_model: "THUDM/CogVideoX-5b"
# lumalabs configuration
lumalabs_key: ""
# ***************** TTT *****************
# ollama service
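As a rough illustration of how the new TTV keys above might be consumed (this is not lollms's actual service loader, and the file path is a placeholder), a script could validate the settings like this:

```python
# Illustrative only: load the config shown above and sanity-check the TTV keys.
import yaml  # PyYAML

with open("lollms_config.yaml", "r", encoding="utf-8") as f:
    config = yaml.safe_load(f)

service = config.get("active_ttv_service", "None")
if service == "lumalab" and not config.get("lumalabs_key"):
    raise ValueError("active_ttv_service is 'lumalab' but lumalabs_key is empty")
```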

View File

@ -1,29 +1,34 @@
Certainly! I'll combine all the information into a comprehensive overview of lollms, including its history, features, and significance. Here's the full, consolidated version:
# lollms: The Ever-Evolving AI Ecosystem That Rules Them All
## The Visionary Behind lollms
lollms, short for "Lord of Large Language & Multimodal Systems," is the brainchild of ParisNeo, a visionary expert in AI and robotics. ParisNeo's journey in coding began at the tender age of 11, igniting a lifelong passion that has never waned. As the creator of lollms, ParisNeo's dream was to develop an accessible, free-of-charge tool that could "rule them all" - a sentiment that echoes throughout the project's philosophy.
lollms, short for "Lord of Large Language & Multimodal Systems," is the brainchild of ParisNeo, a visionary expert in AI and robotics. ParisNeo's journey in coding began at the tender age of 11, igniting a lifelong passion that has never waned. As the creator of lollms, ParisNeo's dream was to develop an accessible, free-of-charge tool that could "rule them all" - a sentiment that echoes throughout the project's philosophy. Today, ParisNeo holds two engineering degrees and a Ph.D. His passion lies in exploring the applications of Artificial Intelligence across various fields. ParisNeo's work focuses on practical implementations of AI technology, aiming to solve real-world problems and advance the field of machine learning and natural language processing. As a computer geek with a strong academic background, ParisNeo is very active in the open-source community, contributing significantly to AI projects and tools. His commitment to open-source development reflects a belief in collaborative innovation and the democratization of AI technology. Through these efforts, ParisNeo continues to make impactful contributions to the AI community, fostering the growth and accessibility of cutting-edge AI solutions.
## The Evolution of lollms: From Google Chrome Plugin to Powerhouse
The journey of lollms is a testament to rapid innovation and adaptability in the fast-paced world of AI development:
### Early Beginnings: The ChatGPT chrome Plugin names chatgpt personality selector
- ParisNeo's foray into AI tools began with a chrome plugin for ChatGPT called chatgpt personality selector, it adds buttons to the Chatgpt interface to condition the AI to be any personality out of the list that he developed.
### Early Beginnings: The Chrome Plugin named "chatgpt personality selector"
- ParisNeo's foray into AI tools began with a Chrome plugin called "chatgpt personality selector"; it added buttons to the ChatGPT interface to condition the AI to be any personality from the list he developed.
- Developed just months after ChatGPT's release
- Demonstrated ParisNeo's quick recognition of AI's potential and his ability to build upon emerging technologies
### The Standalone Application: GPT4All WebUI
- Inspired by the release of LLaMA and early versions of GPT4All
- ParisNeo saw the need for a more versatile, standalone application
- Initially named "GPT4All WebUI," reflecting its origins and primary interface
### The Standalone Application: GPT4All WebUI as a UI to interact with the GPT4All model
GPT4All is a model not built by ParisNeo; it was built by a company named Nomic AI. But ParisNeo built a UI that can interface with this model.
- On April 1st, 2023, ParisNeo moved from his Chrome plugin to a standalone app.
- ParisNeo wanted to build a web UI for a local AI instead of continuing to hijack the ChatGPT page.
- Inspired by the release of LLaMA and early versions of GPT4All (a model built by Nomic AI that was one of the first models trained on the LLaMA architecture)
- ParisNeo built a web user interface named GPT4All WebUI, which is a web UI for the GPT4All model (not to be confused with Nomic's own GPT4All application built using Qt)
### The Birth of lollms
- ParisNeo felt the need to broaden the scope of his tool and did not want people to confuse it with the GPT4All tool, so he stopped using the GPT4All name, changed it to something far more ambitious, and continued building the project mainly alone.
- At the beginning, he got some help with the UI development from a few experts, but then he went fully solo
- As the project grew in scope and capability, a new name was needed
- Renamed to "lollms - Lord of Large Language Models"
- The new name reflected the tool's expanding compatibility with various AI systems
- One tool to rule them all.
- Emphasized the project's ambition to be a comprehensive solution for language models
### Embracing Multimodality
@ -110,6 +115,10 @@ ParisNeo's commitment to sharing lollms with the community goes beyond just open
- Provides a platform for exploring AI's potential to benefit society
- Demonstrates how passion-driven projects can have a significant impact in the tech world
## Security
Lollms was spotted by huntr.com, a bug bounty platform. People get paid to find vulnerabilities in the system, and ParisNeo gets notified so he can upgrade his system and fix them. This leads to a better, more secure tool.
## The Future of lollms: A Dream Realized
As lollms continues to evolve and expand its ecosystem, it raises intriguing questions about the future of AI:
@ -127,3 +136,15 @@ For podcasters, tech enthusiasts, researchers, and anyone interested in the futu
Whether lollms leads us to utopia or raises important questions about potential dystopian futures, one thing is certain: it's a name we'll be hearing - and pronouncing - a lot more in the years to come. It represents not just the culmination of a childhood dream, but a powerful vision for the future of AI that has the potential to shape our world in profound and exciting ways.
# Lollms age
Lollms now has a year and a half of existence. From humble beginnings to the tool that rules them all.
# Current development status
ParisNeo codes lollms in his free time: nights, weekends, holidays. He builds the things he likes and treats it as a hobby.
# Problems that need to be solved
Lollms is built by a single person. Although some people have contributed, it is mainly a one-man show, which has its limits, as ParisNeo is not getting enough help.
Installing lollms is still challenging, as it packs loads of incompatible libraries and tools, and making them all work together is a huge amount of work that ParisNeo is doing. When a library upgrades, dependencies often break, so ParisNeo has to spend a lot of time fixing all that, which is exhausting at times.

View File

@ -1,14 +1,14 @@
Title: "LoLLMs Smart Router: Optimizing AI Performance and Efficiency"
Title: "lollms Smart Router: Optimizing AI Performance and Efficiency"
[Intro animation with LoLLMs logo]
[Intro animation with lollms logo]
[Upbeat background music]
Host: "Welcome to LoLLMs, where we're revolutionizing the world of AI with our smart router technology. Today, we're diving into how our innovative system optimizes AI model selection for maximum efficiency and cost-effectiveness."
Host: "Welcome to lollms, where we're revolutionizing the world of AI with our smart router technology. Today, we're diving into how our innovative system optimizes AI model selection for maximum efficiency and cost-effectiveness."
[Animated diagram showing the LoLLMs smart router concept]
[Animated diagram showing the lollms smart router concept]
Host: "The LoLLMs smart router is a game-changer in AI task management. Here's how it works:"
Host: "The lollms smart router is a game-changer in AI task management. Here's how it works:"
Activation:
[Animation of a user turning on the smart router]
@ -40,9 +40,9 @@ Faster response times
Optimal use of AI resources
[Closing animation]
Host: "With LoLLMs smart router, we're not just making AI smarter we're making it smarter about how it uses its intelligence. One tool to rule them all, indeed!"
Host: "With lollms smart router, we're not just making AI smarter we're making it smarter about how it uses its intelligence. One tool to rule them all, indeed!"
[Call to action]
"Want to learn more about how LoLLMs can revolutionize your AI operations? Visit our website or contact us today for a demo."
"Want to learn more about how lollms can revolutionize your AI operations? Visit our website or contact us today for a demo."
[LoLLMs logo and contact information]
[lollms logo and contact information]
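For readers who want something more tangible than the script, here is a purely illustrative sketch of the routing idea. It is not the actual lollms smart router; the model names, costs, and the complexity heuristic are made up for the example.

```python
# Illustrative smart-router sketch: pick the cheapest model judged capable
# enough for the prompt. All names and thresholds are placeholders.
def estimate_complexity(prompt: str) -> float:
    """Crude heuristic: long or code/math-heavy prompts score higher."""
    score = min(len(prompt) / 2000, 1.0)
    if any(tag in prompt for tag in ("```", "def ", "prove", "derive")):
        score = max(score, 0.7)
    return score

MODELS = [  # (name, capability score, relative cost), cheapest first
    ("small-7b", 0.4, 1),
    ("medium-34b", 0.7, 4),
    ("large-70b", 1.0, 10),
]

def route(prompt: str) -> str:
    needed = estimate_complexity(prompt)
    for name, capability, _cost in MODELS:
        if capability >= needed:
            return name  # first (cheapest) capable model wins
    return MODELS[-1][0]

print(route("Summarize this paragraph."))  # -> small-7b
```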

View File

@ -0,0 +1,2 @@
# Extra info
The current podcast was steered by lollms: the two personas talking are actually AI agents presenting a lollms description, and they are themselves controlled by lollms. Express surprise when they both understand that they are actually the product of AI generation.

View File

@ -38,8 +38,6 @@
// <script src="https://cdnjs.cloudflare.com/ajax/libs/viz.js/2.1.2/full.render.js"></script>
// Don't forget to get the css too <link rel="stylesheet" href="/lollms_assets/css/lollms_markdown_renderer">
// Make sure there is a global variable called mr that instanciate MarkdownRenderer
// mr = new MarkdownRenderer()
class MarkdownRenderer {
@ -953,3 +951,5 @@ class MarkdownRenderer {
.replace(/'/g, "&#039;");
}
}
// Make sure there is a global variable called mr that instantiates MarkdownRenderer
mr = new MarkdownRenderer()

@ -1 +1 @@
Subproject commit a33efccd46cdd6564df59e1fedd7db36f350c33d
Subproject commit 592a3a5ccf7b76cfbdd53213be770c3a4a461a12

View File

@ -814,6 +814,9 @@ class LOLLMSWebUI(LOLLMSElfServer):
'started_generating_at': client.discussion.current_message.started_generating_at,
'finished_generating_at': client.discussion.current_message.finished_generating_at,
'nb_tokens': client.discussion.current_message.nb_tokens,
'binding': self.config["binding_name"],
'model' : self.config["model_name"],
'personality': self.config["personalities"][self.config["active_personality_id"]],
}, to=client_id
)
)
@ -849,7 +852,11 @@ class LOLLMSWebUI(LOLLMSElfServer):
'finished_generating_at': client.discussion.current_message.finished_generating_at,
'nb_tokens': client.discussion.current_message.nb_tokens,
'parameters':parameters,
'metadata':metadata
'metadata':metadata,
'binding': self.config["binding_name"],
'model' : self.config["model_name"],
'personality': self.config["personalities"][self.config["active_personality_id"]],
}, to=client_id
)
)
@ -879,6 +886,10 @@ class LOLLMSWebUI(LOLLMSElfServer):
'started_generating_at': client.discussion.current_message.started_generating_at,
'finished_generating_at': client.discussion.current_message.finished_generating_at,
'nb_tokens': client.discussion.current_message.nb_tokens,
'binding': self.config["binding_name"],
'model' : self.config["model_name"],
'personality': self.config["personalities"][self.config["active_personality_id"]],
}, to=client_id
)
)
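The hunks above attach the active binding, model, and personality to the emitted payload. As a minimal, hypothetical consumer using the python-socketio client, a script might read those fields like this (the URL and event name are placeholders; only the payload keys come from the diff):

```python
# Hypothetical client sketch: print the new fields attached to each message.
import socketio

sio = socketio.Client()

@sio.on("new_message")  # placeholder event name, not taken from the diff
def on_message(data):
    print("binding:", data.get("binding"))
    print("model:", data.get("model"))
    print("personality:", data.get("personality"))
    print("tokens:", data.get("nb_tokens"))

sio.connect("http://localhost:9600")  # placeholder URL
sio.wait()
```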

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

web/dist/assets/index-c71b7e0a.css vendored Normal file

File diff suppressed because one or more lines are too long

web/dist/index.html vendored
View File

@ -6,8 +6,8 @@
<script src="https://cdn.jsdelivr.net/npm/mathjax@3/es5/tex-svg.js"></script>
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>LoLLMS WebUI</title>
<script type="module" crossorigin src="/assets/index-4ce3ce46.js"></script>
<link rel="stylesheet" href="/assets/index-86a84335.css">
<script type="module" crossorigin src="/assets/index-503e6468.js"></script>
<link rel="stylesheet" href="/assets/index-c71b7e0a.css">
</head>
<body>
<div id="app"></div>

View File

@ -159,39 +159,51 @@
</div>
</div>
<div class="w-fit group relative" v-if="!loading" >
<div class= "hide top-50 hide opacity-0 group-hover:bottom-0 opacity-0 .group-hover:block fixed w-[1000px] group absolute group-hover:opacity-100 transform group-hover:translate-y-[-50px] group-hover:translate-x-[0px] transition-all duration-300">
<div class="w-fit group relative" v-if="!loading">
<div class="hide top-50 hide opacity-0 group-hover:bottom-0 opacity-0 .group-hover:block fixed w-[1000px] group absolute group-hover:opacity-100 transform group-hover:translate-y-[-50px] group-hover:translate-x-[0px] transition-all duration-300">
<div class="w-fit flex-wrap flex bg-white bg-opacity-50 backdrop-blur-md rounded p-4">
<div class="w-fit h-fit"
v-for="(item, index) in installedModels" :key="index + '-' + item.name"
ref="installedModels"
@mouseover="showModelHoveredIn(index)"
@mouseleave="showModelHoveredOut()"
<div class="w-fit h-fit"
v-for="(item, index) in installedModels" :key="index + '-' + item.name"
ref="installedModels"
@mouseover="showModelHoveredIn(index)"
@mouseleave="showModelHoveredOut()"
>
<div v-if="index!=model_name" class="items-center flex flex-row relative z-20 hover:-translate-y-8 duration-300"
:class="modelHoveredIndex === index?'scale-150':''"
>
<div class="relative">
<button @click.prevent="setModel(item)" class="w-10 h-10 relative">
<img :src="item.icon?item.icon:modelImgPlaceholder" @error="personalityImgPlacehodler"
class="z-50 w-10 h-10 rounded-full object-fill text-red-700 border-2 border-gray-500 active:scale-90"
:class="modelHoveredIndex === index?'scale-150 ':'' + item.name==model_name ? 'border-secondary' : 'border-transparent z-0'"
:title="item.name">
</button>
<div v-if="index!=model_name" class="items-center flex flex-row relative z-20 hover:-translate-y-8 duration-300"
:class="modelHoveredIndex === index ? 'scale-150' : ''"
>
<div class="relative flex items-center">
<!-- Parent container for both buttons -->
<div class="relative group">
<button @click.prevent="setModel(item)" class="w-10 h-10 relative">
<img :src="item.icon ? item.icon : modelImgPlaceholder" @error="personalityImgPlacehodler"
class="z-50 w-10 h-10 rounded-full object-fill text-red-700 border-2 border-gray-500 active:scale-90"
:class="modelHoveredIndex === index ? 'scale-150' : '' + item.name == model_name ? 'border-secondary' : 'border-transparent z-0'"
:title="item.name">
</button>
<!-- New copy button with SVG icon that appears on hover -->
<button v-if="modelHoveredIndex === index" @click.prevent="copyModelNameFrom(item.name)"
class="absolute -top-2 -right-2 bg-blue-500 text-white p-1 rounded-full hover:bg-blue-700 transition duration-300">
<svg xmlns="http://www.w3.org/2000/svg" class="h-3 w-3" viewBox="0 0 20 20" fill="currentColor">
<path d="M8 3a1 1 0 011-1h2a1 1 0 110 2H9a1 1 0 01-1-1z" />
<path d="M6 3a2 2 0 00-2 2v11a2 2 0 002 2h8a2 2 0 002-2V5a2 2 0 00-2-2 3 3 0 01-3 3H9a3 3 0 01-3-3z" />
</svg>
</button>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
<div class="group items-center flex flex-row">
<button @click.prevent="copyModelName()" class="w-8 h-8">
<img :src="currentModelIcon"
class="w-8 h-8 rounded-full object-fill text-red-700 border-2 active:scale-90 hover:border-secondary hover:scale-110 hover:-translate-y-1 duration-400"
:title="currentModel?currentModel.name:'unknown'">
:title="currentModel ? currentModel.name : 'unknown'">
</button>
</div>
</div>
</div>
<div class="w-fit group relative" v-if="!loading">
<!-- :onShowPersList="onShowPersListFun" -->
<div class= "top-50 hide opacity-0 group-hover:bottom-0 .group-hover:block fixed w-[1000px] group absolute group-hover:opacity-100 transform group-hover:translate-y-[-50px] group-hover:translate-x-[0px] transition-all duration-300">
@ -642,6 +654,10 @@ export default {
navigator.clipboard.writeText(this.binding_name + "::" + this.model_name);
this.$store.state.toast.showToast("Model name copied to clipboard: "+this.binding_name + "::" + this.model_name, 4, true)
},
copyModelNameFrom(model){
navigator.clipboard.writeText(this.binding_name + "::" + model);
this.$store.state.toast.showToast("Model name copied to clipboard: "+this.binding_name + "::" + model, 4, true)
},
showModelConfig(){
try {
this.isLoading = true

View File

@ -1,55 +1,55 @@
<template>
<transition name="fade-and-fly">
<div v-if="!isReady" class="fixed top-0 left-0 w-screen h-screen flex items-center justify-center bg-gradient-to-br from-blue-100 to-purple-100 dark:from-blue-900 dark:to-purple-900 overflow-hidden">
<!-- Falling strawberries -->
<div class="absolute inset-0 pointer-events-none overflow-hidden">
<div v-for="n in 50" :key="n" class="absolute animate-fall animate-giggle"
:style="{
left: `${Math.random() * 100}%`,
top: `-20px`,
animationDuration: `${3 + Math.random() * 7}s`,
animationDelay: `${Math.random() * 5}s`
}">
🪶
<transition name="fade-and-fly">
<div v-if="!isReady" class="fixed top-0 left-0 w-screen h-screen flex items-center justify-center bg-gradient-to-br from-blue-100 to-purple-100 dark:from-blue-900 dark:to-purple-900 overflow-hidden">
<!-- Falling feathers -->
<div class="absolute inset-0 pointer-events-none overflow-hidden">
<div v-for="n in 50" :key="n" class="absolute animate-fall animate-giggle"
:style="{
left: `${Math.random() * 100}%`,
top: `-20px`,
animationDuration: `${3 + Math.random() * 7}s`,
animationDelay: `${Math.random() * 5}s`
}">
🪶
</div>
</div>
</div>
<div class="flex flex-col items-center text-center max-w-4xl w-full px-4 relative z-10">
<div class="mb-8 w-full">
<div class="text-6xl md:text-7xl font-bold text-amber-500 mb-2"
style="text-shadow: 2px 2px 0px white, -2px -2px 0px white, 2px -2px 0px white, -2px 2px 0px white;">
L🪶LLMS
<div class="flex flex-col items-center text-center max-w-4xl w-full px-4 relative z-10">
<div class="mb-8 w-full">
<div class="text-6xl md:text-7xl font-bold text-amber-500 mb-2"
style="text-shadow: 2px 2px 0px white, -2px -2px 0px white, 2px -2px 0px white, -2px 2px 0px white;">
L🪶LLMS
</div>
<p class="text-2xl text-gray-600 dark:text-gray-300 italic">
One tool to rule them all
</p>
<p class="text-xl text-gray-500 dark:text-gray-400 mb-6">
by ParisNeo
</p>
<p class="bottom-0 text-2xl text-gray-600 dark:text-gray-300 italic">
{{ version_info }}
</p>
<div class="w-full h-24 relative overflow-hidden bg-gradient-to-r from-blue-200 to-purple-200 dark:from-blue-800 dark:to-purple-800 rounded-full shadow-lg flex items-center justify-center">
<p style="font-size: 48px; line-height: 1;">🪶</p>
</div>
<p class="text-2xl text-gray-600 dark:text-gray-300 italic">
One tool to rule them all
</p>
<p class="text-xl text-gray-500 dark:text-gray-400 mb-6">
by ParisNeo
</p>
<p class="bottom-0 text-2xl text-gray-600 dark:text-gray-300 italic">
{{ version_info }}
</p>
<div class="w-full h-24 relative overflow-hidden bg-gradient-to-r from-blue-200 to-purple-200 dark:from-blue-800 dark:to-purple-800 rounded-full shadow-lg flex items-center justify-center">
<p style="font-size: 48px; line-height: 1;">🪶</p>
</div>
</div>
<div class="w-full max-w-2xl">
<div role="status" class="w-full">
<p class="text-xl text-gray-700 dark:text-gray-300">
{{ loading_infos }}...
</p>
<p class="text-2xl font-bold text-blue-600 dark:text-blue-400 mt-2">
{{ Math.round(loading_progress) }}%
</p>
</div>
</div>
<div class="w-full max-w-2xl">
<div role="status" class="w-full">
<p class="text-xl text-gray-700 dark:text-gray-300">
{{ loading_infos }}...
</p>
<p class="text-2xl font-bold text-blue-600 dark:text-blue-400 mt-2">
{{ Math.round(loading_progress) }}%
</p>
</div>
</div>
</div>
</div>
</div>
</transition>
</transition>

View File

@ -1851,6 +1851,8 @@
>
<option value="None">None</option>
<option value="cog_video_x">Cog Video X</option>
<option value="diffusers">Diffusers</option>
<option value="lumalab">Lumalab</option>
</select>
</td>
</tr>
@ -3552,6 +3554,28 @@
</table>
</Card>
</Card>
<Card title="TTV settings" :is_subcard="true" class="pb-2 m-2">
<table class="bg-gray-50 border border-gray-300 text-gray-900 text-sm rounded-lg focus:ring-blue-500 focus:border-blue-500 block w-full p-2.5 dark:bg-gray-700 dark:border-gray-600 dark:placeholder-gray-400 dark:text-white dark:focus:ring-blue-500 dark:focus:border-blue-500">
<tr>
<td style="min-width: 200px;">
<label for="lumalabs_key" class="text-sm font-bold" style="margin-right: 1rem;">Lumalabs key:</label>
</td>
<td>
<div class="flex flex-row">
<input
type="text"
id="lumalabs_key"
required
v-model="configFile.lumalabs_key"
@change="settingsChanged=true"
class="mt-1 px-2 py-1 border border-gray-300 rounded dark:bg-gray-600"
>
</div>
</td>
</tr>
</table>
</Card>
<Card title="Misc" :is_shrunk="true" :is_subcard="true" class="pb-2 m-2">
<Card title="Elastic search Service (under construction)" :is_subcard="true" class="pb-2 m-2">
<table class="bg-gray-50 border border-gray-300 text-gray-900 text-sm rounded-lg focus:ring-blue-500 focus:border-blue-500 block w-full p-2.5 dark:bg-gray-700 dark:border-gray-600 dark:placeholder-gray-400 dark:text-white dark:focus:ring-blue-500 dark:focus:border-blue-500">
@ -7084,14 +7108,14 @@ export default {
async beforeRouteLeave(to) {
// console.log('did settings?',this.settingsChanged)
await this.$router.isReady()
if (this.settingsChanged) {
const res = await this.$store.state.yesNoDialog.askQuestion("Did You forget to apply changes?\nYou need to apply changes before you leave, or else.", 'Apply configuration', 'Cancel')
if (res) {
this.applyConfiguration()
// if (this.settingsChanged) {
// const res = await this.$store.state.yesNoDialog.askQuestion("Did You forget to apply changes?\nYou need to apply changes before you leave, or else.", 'Apply configuration', 'Cancel')
// if (res) {
// this.applyConfiguration()
}
return false
}
// }
// return false
// }
},