.Net Fullstack Developer - Boeing India
Programmer Analyst - .Net Developer - Boeing India
System Engineer - INFOSYS
ADO
GitLab
CI/CD
Azure DevOps
SonarQube
TFS
A brief introduction about myself: I have been working as a full stack developer, and my complete experience is in .NET. I have worked on a variety of .NET applications, be it reporting applications, database applications using MySQL or SQL Server, web applications, or desktop applications. Apart from all that, I also have experience with server-side programming. Beyond developing applications, I have experience in automation using Selenium and Appium, so I can develop automation tests. I also have experience with DevOps: I am responsible for creating CI/CD pipelines and for managing all the releases. Currently, I am working on a web application that has several microservices interacting with each other and a UI that interacts with these microservices. We have implemented a little bit of multithreading, we are using Entity Framework, and we have also used the cloud for our deployment processes.
My team has to migrate a legacy .NET application to .NET Core; how would I handle data migration from MySQL to MS SQL in this process? Migrating an application from .NET Framework to .NET Core is a little difficult, but it is achievable. First of all, we will have to install the .NET Core version we want to migrate to; let's say the latest one is .NET 8. When we migrate, a lot of functions have been replaced. Code-wise it will be fairly simple, because in both places the code is in C#, so it is easy to modify. However, some functions have changed from .NET Framework to .NET Core, so we will have to take care of which functions to use and what the replacements are for the methods that existed before. Also, migrating from MySQL to MS SQL will run into problems with stored procedures and triggers, if we have any in the database. The integrity, meaning the relationships between the tables, will stay the same because both are RDBMSs, so in that respect it is easier to perform the migration for the database.
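As one example of the kind of API replacement mentioned above, configuration access usually has to be rewritten during a Framework-to-Core migration. This is a minimal sketch, assuming an appsettings.json with a ConnectionStrings section; the key name "Default" is illustrative:

    // .NET Framework style (System.Configuration):
    //   string cs = ConfigurationManager.ConnectionStrings["Default"].ConnectionString;

    // .NET Core replacement (Microsoft.Extensions.Configuration):
    using Microsoft.Extensions.Configuration;

    var config = new ConfigurationBuilder()
        .AddJsonFile("appsettings.json")
        .Build();

    // "Default" is an illustrative connection-string name.
    string cs = config.GetConnectionString("Default");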
What are the potential risks of not following the SOLID principles when developing a new module? If you are not following the SOLID principles, it is very easy to introduce a bug in the code, and it becomes very difficult to rectify and pinpoint where the bug is. The SOLID principles help us reduce code redundancy, and they help us achieve scalability in the application. Starting with S, it is the single responsibility principle: a class should have only one reason to change. Next there is the open/closed principle, which means a class should be open for extension but closed for modification. L is the Liskov substitution principle: objects of a derived class should be substitutable wherever the base class is expected, without breaking behavior. I is interface segregation, which is similar to what we have for classes: an interface should not be overloaded with unrelated responsibilities; those should be defined as separate interfaces, which can then be used in any class. D is dependency inversion: high-level classes should not depend on low-level classes; both should depend on abstractions. Dependency injection helps us a lot here, because we are not creating the object every time using the new keyword; wherever the service or the view model is required, it can be injected directly and used throughout the application. That is how we can mitigate the risk.
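A minimal sketch of the dependency-injection point above in ASP.NET Core; IEmailService and SmtpEmailService are hypothetical names used only for illustration:

    using Microsoft.AspNetCore.Builder;
    using Microsoft.Extensions.DependencyInjection;

    var builder = WebApplication.CreateBuilder(args);
    // Register the abstraction once; consumers never call new SmtpEmailService().
    builder.Services.AddScoped<IEmailService, SmtpEmailService>();
    var app = builder.Build();
    app.Run();

    // Hypothetical service, for illustration only.
    public interface IEmailService { void Send(string to, string body); }
    public class SmtpEmailService : IEmailService
    {
        public void Send(string to, string body) { /* send via SMTP */ }
    }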
How would you apply the repository pattern in a .NET Core application to abstract data access? The pattern that we follow is the MVC pattern. We have the model, the view, and the controller. Models are a representation of what we have in the database, or we can say in the repository. Then we have the view models. View models are what we are going to display in the view; not all properties of the models need to be displayed in the view, only a few, so that is why we use view models, and we perform a one-to-one mapping if required. For establishing the database connection, we can use the Program.cs file and keep our DB connection details there, so the connection is configured only one time; we register the service like a provider, and then whoever wants to consume it can have it, because it is already registered. Then we have our services managing the business logic. Let's say from the view model we are getting the name; the service separates the first name and the last name and then gives it to the model to update the DB using the DbContext. That is how we have the service in between. The controller is responsible for managing the views, but it does not contain business logic.
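For the repository pattern the question itself asks about, a common EF Core shape looks like this; Employee, AppDbContext, and the method set are illustrative assumptions, not taken from the transcript:

    using Microsoft.EntityFrameworkCore;
    using Microsoft.Extensions.DependencyInjection;

    public interface IEmployeeRepository
    {
        Task<Employee?> GetByIdAsync(int id);
        Task AddAsync(Employee employee);
    }

    public class EmployeeRepository : IEmployeeRepository
    {
        private readonly AppDbContext _db;   // EF Core DbContext
        public EmployeeRepository(AppDbContext db) => _db = db;

        public Task<Employee?> GetByIdAsync(int id) =>
            _db.Employees.FirstOrDefaultAsync(e => e.Id == id);

        public async Task AddAsync(Employee employee)
        {
            _db.Employees.Add(employee);
            await _db.SaveChangesAsync();
        }
    }

    // Program.cs: register the DbContext once, then the repository abstraction,
    // so controllers and services depend on IEmployeeRepository, not on EF directly.
    builder.Services.AddDbContext<AppDbContext>(o => o.UseSqlServer(connectionString));
    builder.Services.AddScoped<IEmployeeRepository, EmployeeRepository>();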
How would I design data access for a multi-tenant application? I haven't built a multi-tenant application, but in order to establish connections with all the databases, we have to have different DbContexts, and we have to create the connections in Program.cs accordingly. .NET Core provides the facility to create connections not only with MS SQL but with Oracle and MySQL as well, so we can have different DB connections. Let's say we have one table in an Oracle database and another table in MS SQL: we use different DbContexts, populate the models from both databases, and then do the business interaction between them. So the way to design it, as I mentioned, is through the Program.cs file, where you register the DB connections as services.
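A sketch of registering two DbContexts in Program.cs, as described above; the context names and connection-string keys are illustrative, and the Oracle call assumes the Oracle EF Core provider package is installed:

    // Program.cs: one context per database engine.
    builder.Services.AddDbContext<SqlServerDbContext>(options =>
        options.UseSqlServer(builder.Configuration.GetConnectionString("MsSql")));
    builder.Services.AddDbContext<OracleDbContext>(options =>
        options.UseOracle(builder.Configuration.GetConnectionString("Oracle")));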
What approach would I take to leverage the framework's caching capabilities for this query? Using caching is very easy. Let's say a person is trying to log in to the application: we hit the DB only once to get the user details, and then we put those user details into the cache. We then do not need to go to the DB every time to fetch the user details in order to perform any action while the user is on the system. We can also set an expiry for this cache, so that after so many minutes, or if the system is idle for a certain amount of time, the cached entry is removed and the service goes back to reading from the DB. The other place where we can use caching is to store the JavaScript that we ship with the application. Let's say I am releasing a new feature and we have to modify the JavaScript, but it is already cached in the web browser: then the user first has to clear the cache and reload the page to be able to see the new changes. That is how we can make use of caching.
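A minimal sketch of the login-details idea above using ASP.NET Core's IMemoryCache; UserService, the 20-minute sliding expiry, and LoadUserFromDbAsync are illustrative assumptions:

    using Microsoft.Extensions.Caching.Memory;

    public class UserService
    {
        private readonly IMemoryCache _cache;
        public UserService(IMemoryCache cache) => _cache = cache;

        public async Task<UserDetails?> GetUserAsync(int userId)
        {
            // Hit the DB only on a cache miss; evict after 20 idle minutes.
            return await _cache.GetOrCreateAsync($"user:{userId}", entry =>
            {
                entry.SlidingExpiration = TimeSpan.FromMinutes(20);
                return LoadUserFromDbAsync(userId);   // hypothetical DB call
            });
        }
    }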
Given this JavaScript sample, suppose the console.log wasn't showing the expected result; could I identify why? Instead of doing console.log(data), we have to first store the result of modifyData into some variable, let's say let result = modifyData(data), and then with console.log we have to print that result value.
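The snippet itself isn't shown in the transcript, but the fix being described would look something like this; modifyData and data are the names used in the answer, and everything else is hypothetical:

    // Capture the transformed value instead of logging the unmodified input.
    let result = modifyData(data);
    console.log(result);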
SELECT name, MAX(salary) FROM employees GROUP BY department HAVING COUNT(*) > 5: what is wrong with this query? In the GROUP BY we will have to account for name as well, because any non-aggregated column that we put in the SELECT list also has to appear in the GROUP BY. That is the error; as written, we will not be able to get proper results. For HAVING COUNT(*) > 5, instead of the star we can also count a specific column; assuming it is the department id, we can write COUNT(department_id) > 5.
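A corrected version of the query along the lines the answer describes, assuming the intent is the highest salary per department among departments with more than 5 rows (the non-aggregated column is switched from name to department so the SELECT list matches the GROUP BY):

    SELECT department, MAX(salary) AS max_salary
    FROM employees
    GROUP BY department
    HAVING COUNT(*) > 5;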
What is the most efficient way to synchronize real-time data between a .NET Core application and connected clients using SignalR and a React front end, while staying consistent with the back-end MS SQL database? To maintain consistency, one approach we can follow is the master-slave database design. We send the write operations to the master database, and whenever we are trying to read, we read from the slave database. It also depends on what industry we are trying to serve here. If it is, say, a fintech company and we want the transaction to be reflected as and when the operation happens, then in the cloud we have to configure it to be strongly consistent, so it performs the write operation and updates the slave database accordingly. This is one efficient way.
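On the SignalR side, a minimal sketch of pushing a change to connected React clients after the master write succeeds; DataHub, OrderService, the Order type, and the "orderUpdated" event name are all illustrative:

    using Microsoft.AspNetCore.SignalR;

    public class DataHub : Hub { }

    public class OrderService
    {
        private readonly IHubContext<DataHub> _hub;
        public OrderService(IHubContext<DataHub> hub) => _hub = hub;

        public async Task SaveOrderAsync(Order order)
        {
            // ... persist to the master MS SQL database first ...
            // Then broadcast the change to every connected client.
            await _hub.Clients.All.SendAsync("orderUpdated", order);
        }
    }

    // Program.cs
    builder.Services.AddSignalR();
    app.MapHub<DataHub>("/hubs/data");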
Can I describe the process for setting up a DevOps workflow for a .NET project that includes automated testing and deployment to the AWS cloud? In order to set up the workflow in DevOps, the first thing you have to do is set up your code repository. To set up the code repository, you first have to establish the service connection. Let's say the code repository is in ADO: we first have to get the service connection from there. To get the service connection, they provide something called a token; with that token and a password, you create a service connection for your CI or CD pipeline. For testing, what we can do in the CI pipeline, the continuous integration pipeline, is add a step to run the tests against the build it generates. There is a dedicated step to run unit tests, and you can specify the file names with a wildcard pattern: you can put a star, and if a name ends with UnitTest you can match it, or with IntegrationTest accordingly. For deployment, you set up a separate release pipeline, which deploys to the cloud. To deploy to the cloud, you first establish the connection: using PowerShell, you can connect with the username and password, and then choose the environment. If you have, let's say, prod, stage, and dev all in one place, you choose which one to deploy to, which is known as the space on the cloud. After choosing the space, you perform a cf push; there is a PowerShell script for that as well.
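A rough sketch of the CI steps described above as an Azure DevOps YAML pipeline; the pool image and the test-project wildcard are illustrative assumptions, not taken from the transcript:

    trigger:
      - main

    pool:
      vmImage: 'windows-latest'

    steps:
      # Build the solution from the connected code repository.
      - task: DotNetCoreCLI@2
        displayName: 'Build'
        inputs:
          command: 'build'
      # Run every test project whose name ends with UnitTest, per the wildcard idea above.
      - task: DotNetCoreCLI@2
        displayName: 'Run unit tests'
        inputs:
          command: 'test'
          projects: '**/*UnitTest.csproj'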
What do I recommend for securing the web API endpoints? In order to transport data from the React application front end to the .NET Core web API endpoints, there are a lot of options, at several layers. First, we can have an API key sent from the application to the .NET Core web API we are trying to access, and we authenticate it on the API side. That is one way. In order to implement this API-key auth, we can add our own attribute. For that attribute to be executed, we have to implement IAsyncActionFilter, whose method takes two arguments, the action executing context and the action execution delegate. You write your logic there, and at the end you await next(), so the next step can be executed. The action is basically our controller method, where we handle the request made by the client side. Apart from that, we can use JWT tokens. JWTs can work with a public and private key pair: the public key is given to the client, and the private key is stored on the server side. What we do with JWT is sign the data before sending it over the network: encoding the data, serializing it, and then sending it. On the server side we can validate the token using the private key, so with both of these we have different levels of security.
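A sketch of the API-key attribute described above, implementing IAsyncActionFilter; the X-Api-Key header name and the inline comparison are illustrative, and a real filter would read the expected key from configuration:

    using Microsoft.AspNetCore.Mvc;
    using Microsoft.AspNetCore.Mvc.Filters;

    public class ApiKeyAttribute : Attribute, IAsyncActionFilter
    {
        // The two arguments mentioned in the answer: the executing context
        // and the delegate that runs the next step (ultimately the action).
        public async Task OnActionExecutionAsync(
            ActionExecutingContext context, ActionExecutionDelegate next)
        {
            // "X-Api-Key" is an illustrative header name.
            if (!context.HttpContext.Request.Headers.TryGetValue("X-Api-Key", out var key)
                || key != "expected-key")   // in practice, compare against stored config
            {
                context.Result = new UnauthorizedResult();
                return;
            }
            await next();   // let the controller action execute
        }
    }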