Get the Latest News with our Knime Workflow

 

In this post, we’ll explore how we built a Knime project to fetch the latest news from top media outlets using their RSS feeds. This workflow allows us to collect real-time news updates and present them in a user-friendly interface.


Click here to Download this project from Official Knime Page

Connecting to RSS Feeds

The first step of our workflow involves connecting to the RSS feeds of major media outlets. We achieve this by using Knime's RSS Feed Reader node, which pulls in live updates from multiple sources.

Once we’ve gathered the news, the workflow concatenates the results into a single table, providing a unified view of the news headlines. This is then passed to a dashboard component for further interaction and visualization.
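Knime's RSS Feed Reader node does this natively, but as a rough illustration of the fetch-and-concatenate step, here is a standard-library Python sketch (the feed snippets and outlet names are made up for the example):

```python
import xml.etree.ElementTree as ET

def parse_feed(xml_text, source):
    """Extract (source, title, link) rows from an RSS 2.0 document."""
    root = ET.fromstring(xml_text)
    rows = []
    for item in root.iter("item"):
        rows.append({
            "source": source,
            "title": item.findtext("title", default=""),
            "link": item.findtext("link", default=""),
        })
    return rows

# Two tiny inline feeds stand in for the live outlets.
feed_a = "<rss><channel><item><title>Headline A</title><link>http://a.example/1</link></item></channel></rss>"
feed_b = "<rss><channel><item><title>Headline B</title><link>http://b.example/1</link></item></channel></rss>"

# Concatenate the per-outlet results into one unified table.
news_table = parse_feed(feed_a, "Outlet A") + parse_feed(feed_b, "Outlet B")
```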

Creating an Interactive Dashboard

Inside the dashboard component, we present the news data in two key ways. First, the results are shown in a table with clickable hyperlinks, allowing users to quickly access the full news articles. Additionally, we include a slicer based on media outlets, enabling users to filter the news by source.

Visualizing Word Frequencies with a Word Cloud

In the second part of the dashboard, we introduce a word cloud that highlights the most common terms across the news articles. Before generating the word cloud, the workflow processes the data to clean up irrelevant words and structures it into a "bag of words." We then calculate term frequencies, visualizing the most important or trending keywords in the news.

This not only helps users quickly identify popular topics but also adds an engaging visual element to the dashboard.
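The bag-of-words and term-frequency step the workflow performs before the word cloud can be sketched in Python like this (the stopword list and sample headlines are illustrative, not what the workflow actually uses):

```python
from collections import Counter

STOPWORDS = {"the", "a", "an", "of", "to", "in", "and", "on", "for"}

def term_frequencies(headlines):
    """Tokenize headlines, drop stopwords, and count term frequencies."""
    bag = []  # the "bag of words"
    for text in headlines:
        for token in text.lower().split():
            word = token.strip(".,!?:;\"'")
            if word and word not in STOPWORDS:
                bag.append(word)
    return Counter(bag)

freqs = term_frequencies([
    "Markets rally on rate cut hopes",
    "Rate cut expected in autumn",
])
# freqs.most_common(n) surfaces the trending terms for the word cloud.
```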

One-Click Access to the Latest News

The final result is an intuitive, streamlined interface where you can instantly get the latest headlines from top outlets with just a single click. With Knime's browser capabilities, you can also read the articles directly within the platform, eliminating the need to switch between applications.





Conclusion

With this Knime workflow, staying updated on breaking news has never been easier. The combination of real-time data fetching, interactive filtering, and keyword visualization offers a powerful tool for anyone needing instant access to the latest media coverage.

If you enjoyed this project, please share it and leave a comment below. Don’t forget to explore our Knime segment for more exciting workflows and automation tools!

Share:

Scrape your Competitor's Websites with Advanced Web Scraper

In this post, we will explore the details of our latest project: the Advanced Web Scraper specifically designed for H&M Germany using the Knime Workflow. This powerful tool allows users to connect directly to the H&M Germany website, effectively gathering vital information such as product categories, sub-categories, product page URLs, and price data. By organizing this data into specific hierarchies, businesses can gain valuable insights into their competitive landscape.




Click here to Download this workflow from Official Knime Page

The template we've created is versatile enough to be adapted for other retailers. However, it’s important to note that since each website has a unique design, some adjustments will be necessary to ensure optimal functionality. As web design updates occur, the code may require modifications to maintain accuracy and effectiveness.

Getting Started with the Workflow

To kick off our scraping process, we will use the Webpage Retriever and XPath nodes in Knime to connect to the H&M Germany website. The Webpage Retriever is instrumental in fetching the HTML content of the site, while the XPath nodes facilitate targeted extraction of specific data elements, such as product categories. This initial step sets the foundation for gathering crucial information that will be analyzed later.
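To give a feel for what the XPath extraction does, here is a minimal Python sketch; the markup below is a simplified stand-in, as the real H&M page uses different class names and a deeper hierarchy:

```python
import xml.etree.ElementTree as ET

# Simplified stand-in for a category navigation menu.
html = """
<nav>
  <ul>
    <li><a href="/de/damen">Damen</a></li>
    <li><a href="/de/herren">Herren</a></li>
  </ul>
</nav>
"""

root = ET.fromstring(html)
# Equivalent in spirit to an XPath query like //li/a
categories = [(a.text, a.get("href")) for a in root.findall(".//li/a")]
```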




After we retrieve the data, the next phase involves transforming it to extract price information and format the dataset for our reporting needs. This includes cleaning the data, filtering out irrelevant entries, and ensuring that it meets our quality standards.



Analyzing Price Distribution

One of the critical aspects of this project is calculating the product count at each price level per category. By doing so, we can analyze how prices are distributed across various categories, providing insights into market trends and pricing strategies. Understanding this distribution helps businesses identify competitive pricing strategies, product placement, and potential market gaps.
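The "product count at each price level per category" calculation boils down to a group-by count. A Python sketch with invented sample rows:

```python
from collections import Counter

# Illustrative scraped rows: (category, price in EUR) — not real H&M data.
products = [
    ("Damen", 9.99), ("Damen", 9.99), ("Damen", 19.99),
    ("Herren", 9.99), ("Herren", 29.99),
]

# Product count at each (category, price) level.
distribution = Counter(products)
```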

Once we complete the necessary transformations, the data will be ready to be sent to Power BI for advanced reporting and visualization. Power BI’s robust features allow us to create dynamic dashboards that highlight key performance indicators and other essential metrics, empowering stakeholders to make informed decisions.


Efficient Workflow Management

To streamline our process, we leverage metanodes within Knime. Metanodes allow us to encapsulate multiple steps into a single unit, enabling us to execute the entire workflow with just one click. This feature not only enhances efficiency but also simplifies the workflow, making it accessible even for users who may not be as experienced with Knime.

Additionally, we provide options to export the results to Excel, allowing users to manipulate and analyze the data further in a familiar format. This flexibility ensures that our users can utilize the insights generated in whatever manner suits their needs best.




If you liked this project, please don't forget to share and leave a comment below!



One Click to get Weather Forecast of your favorite cities

In this post, we will explore a powerful Knime Workflow that retrieves weather forecast information for cities of your choice. Weather data is invaluable for a variety of applications, from travel planning to agriculture, and our workflow makes it easier than ever to access and analyze this data efficiently.




(This workflow is available for download at BI-FI Business Projects Knime Hub page.)

Click here to visit the download page

Step 1: Connecting to the Weather Forecast Website

The first step in our workflow is to connect to a reputable weather forecast website where we can extract the information we need. For this project, we’ll utilize the site located at weather-forecast.com, which provides comprehensive weather data for locations worldwide.

To establish this connection, we will employ Webpage Retriever and XPath nodes within Knime. The Webpage Retriever will allow us to fetch the HTML content from the website, while the XPath nodes enable us to extract specific data elements, such as temperature, humidity, wind speed, and precipitation forecasts. This precise extraction is crucial for ensuring that we gather relevant and useful data for our analysis.
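As a small illustration of the extraction step, the sketch below pulls numeric temperatures out of a markup snippet; the real weather-forecast.com page layout differs, so the pattern is illustrative only:

```python
import re

# Simplified markup standing in for forecast cells on the page.
html = '<td class="temp">21°C</td><td class="temp">18°C</td>'

# Pull the numeric temperature out of each cell.
temps = [int(m) for m in re.findall(r">(-?\d+)°C<", html)]
```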



Step 2: User Interaction with City Selection

Next, we move to the STEP 1 Component of our workflow, where we introduce an interactive feature for users to select the cities they wish to monitor. This is accomplished using the Nominal Row Filter Widget, which presents a comprehensive list of all major cities from around the globe.

The ability to customize city selection enhances user experience, making it straightforward for anyone to retrieve weather forecasts for their specific locations of interest. Users can simply scroll through the list or utilize search functionality to quickly find their desired city. Once they have selected the cities, they can proceed to the next stage of the workflow.



Step 3: Data Transformation and Dashboard Integration

After the user has made their selections, the workflow proceeds to STEP 3, where we perform the necessary data transformations. This metanode is responsible for cleaning and structuring the data into a usable format. We ensure that all extracted data is consistent and well-organized, allowing for accurate representation in subsequent visualizations.

Once the transformation is complete, the final dataset is fed into a dashboard designed to display the weather forecasts for the selected cities. The dashboard serves as a visual representation of the weather data, allowing users to easily interpret the information and make informed decisions based on the forecasts.




Effortless Execution

To run the workflow, users simply need to click the Execute All button within Knime. This action will trigger the entire workflow, automating the process of data retrieval, transformation, and visualization. The seamless execution of the workflow demonstrates Knime's capability in handling complex data processing tasks with ease.

Explore More Workflows

For those interested in further expanding their data analytics capabilities, we encourage you to explore more workflows like this one. Check out the Knime section on our website for a variety of projects that can enhance your data analysis skills. Additionally, you can visit our BI-FI Business Knime Hub Profile to discover even more resources and tools tailored to your needs.

In conclusion, this Knime Workflow not only simplifies the process of accessing weather forecast data but also empowers users to make data-driven decisions. We invite you to download the workflow, explore its functionalities, and share your feedback with us. Your input is invaluable in helping us improve and develop more useful tools for data analysis!


Knime Date Difference Calculator

   In this post, we will take a look at the Knime Date Difference Calculator created by BI-FI Blogs and explore how to use it. 

   This project is useful for calculating due dates, managing shipment dates, or simply calculating the date difference to see:

  • How many days and working days have passed since the Start Date
  • How many weeks have passed since the Start Date
  • How many days and working days are left until the End Date
  • How many weeks are left until the End Date


   This workflow contains flow variable usage, date-time extraction nodes, and advanced syntax for date calculations. You can copy these node configurations for your own projects as well.

   You can download the workflow via the link below. All the workflows created by BI-FI Blogs will be available on Knime Hub as well.

Click to Check the Knime Date Difference Calculator

The workflow should look like this. Now let's break it down to see how it works.



First, execute and view STEP 1 and enter the Start Date and End Date for the calculations.




Now, just run the STEP 2 metanode. This metanode creates a date range from your Start and End Dates and calculates the durations used in the later steps.








   After that, STEP 3.1 calculates how many days, working days, and weeks have passed. To calculate the working days, we first extract the day number of each date; if the day number is greater than 5 (Friday), that day is labeled as a weekend.
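The same "days, working days, and weeks passed" logic can be sketched in Python; `isoweekday()` numbers Monday as 1 through Sunday as 7, matching the "day number greater than 5 is a weekend" rule (the dates used are just examples):

```python
from datetime import date, timedelta

def days_passed(start, today):
    """Count total days, working days, and whole weeks from start to today."""
    total = (today - start).days
    # isoweekday(): Mon=1 .. Sun=7, so > 5 means Saturday or Sunday.
    work = sum(
        1 for d in range(total)
        if (start + timedelta(days=d + 1)).isoweekday() <= 5
    )
    return total, work, total // 7

# Example: Mon 2024-01-01 to Mon 2024-01-15.
total, work, weeks = days_passed(date(2024, 1, 1), date(2024, 1, 15))
```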







For STEP 3.2, to calculate how many days are left until the End Date, we need to know the execution date. We use the nodes on the right to find the execution date and pass it as a flow variable for use in the calculations.

   Finally, we can run the dashboard node to see all the calculations at once in the tables below.

   We have explained the workflow details, but the only manual step is entering the dates once. After that, you can run the workflow and it will create the tables for you.



If you liked this project, please don't forget to share and leave a comment below! 




Twitter Scraper Project: Retrieve Latest Tweets with Knime


In this post, we will delve into the exciting world of social media analytics by creating a Knime Workflow that retrieves the latest tweets from Twitter containing specific keywords. As businesses and individuals increasingly turn to Twitter for real-time updates, sentiment analysis, and engagement tracking, having a tool to gather relevant tweets is invaluable.

Overview of the Project

Our project aims to develop a user-friendly workflow that can automatically pull in tweets based on chosen keywords. The result will be a neatly organized table that displays not only the tweets themselves but also important metrics such as the user's name, follower count, and retweet count. This information can be incredibly useful for marketers, researchers, and anyone looking to gain insights from social media conversations.

Prerequisites: Setting Up Your Twitter Developer Account

Before we can start scraping tweets, you'll need to have a Twitter Developer Account. This is a straightforward process and is free of charge. Simply follow the link below to apply for your account:
Apply for Twitter Developer Account

Once your account is set up, you will receive your unique API keys, which are essential for authenticating your requests to the Twitter API. These keys will enable you to interact with Twitter’s data securely and access the tweets you’re interested in.

Download the Workflow Created by BI-FI Blogs

This workflow serves as a robust template that you can customize to suit your specific needs. It has been designed with ease of use in mind, so even those who are new to Knime will find it accessible.


Connecting to Twitter's API

Once you have downloaded the workflow, the first thing you should do is to open the Twitter API Connector node within Knime. Here, you’ll need to enter your Twitter developer account details, including your API keys. This step is crucial, as it establishes a secure connection between your workflow and Twitter's API.


Setting Up Your Search Keywords

Next, you will need to specify the keywords for your search. In the Twitter API Connector, simply input the keywords that you want to track. After entering your desired keywords, click on the “Click Apply as New Default” button. This action saves your settings and prepares the workflow to fetch tweets that match your criteria.
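For context, a keyword search against the public Twitter API v2 recent-search endpoint amounts to a request with parameters like the ones below; authentication (a bearer token) is what the Twitter API Connector node handles for you, and this sketch only builds the parameters rather than calling the network:

```python
def build_search_params(keywords, max_results=10):
    """Build query parameters for a Twitter API v2 recent-search request.

    Combines keywords with OR so any of them matches, and requests the
    public metrics (retweet count etc.) alongside each tweet.
    """
    return {
        "query": " OR ".join(keywords),
        "max_results": max_results,
        "tweet.fields": "public_metrics,created_at",
    }

params = build_search_params(["knime", "dataanalytics"])
```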

Executing the Workflow

With everything set up, you can now run the rest of the workflow. Just hit the Execute All button, and watch as Knime performs its magic! The workflow will automatically connect to the Twitter API, search for the latest tweets containing your specified keywords, and compile the results into a comprehensive table.

Exporting Your Results

Once the tweets have been gathered, you can easily export the results to an Excel file for further analysis or reporting. This is accomplished using Step 5 of the workflow, which streamlines the process of data exportation. Having your data in Excel format allows for additional manipulation and presentation options.



Share Your Feedback

If you found this project helpful, please don't forget to share it with your network! We welcome your feedback and encourage you to leave a comment below. Your insights not only help us improve our workflows but also assist others who are exploring the capabilities of data scraping and analysis using Knime.


Conclusion

In conclusion, the Twitter Scraper Project empowers users to tap into the wealth of information available on Twitter efficiently. By following the steps outlined in this post, you can create a valuable tool for social media analysis that helps you stay informed about trends and conversations relevant to your interests or business goals. Download the workflow today and start uncovering insights from Twitter!


How To Read and Automate any RSS with Knime

In this post, we will learn how to read an RSS feed and automate this process using Knime. RSS feeds are a powerful tool for keeping up with the latest updates on your favorite websites. They allow you to receive information in a standardized format, making it easier to aggregate content and stay informed.

What is RSS?

Before we dive into the technical aspects, let's take a moment to understand what RSS actually means. RSS stands for Really Simple Syndication. It is a web feed that allows users and applications to access updates to websites in a standardized, computer-readable format.

By leveraging RSS feeds, you can stay updated on new content from blogs, news sites, and other online platforms without having to visit each site manually. This is particularly useful for content creators and marketers who want to track updates from multiple sources efficiently.

Our Example: BI-FI Blogs RSS Feed

In our example, we will utilize the RSS feed of our blog. By following this guide, you will be able to view our blog posts along with their links and creation dates every time you run the workflow. This way, you won’t miss any of the valuable content we publish!

Step 1: Getting the RSS Feed URL

The first step in automating the reading of an RSS feed is to obtain the URL of the feed itself. In our case, you can click on the RSS icon on our blog, which will take you to a new tab displaying the feed. The URL for our blog's RSS feed is as follows:

https://bifiblogs.blogspot.com/feeds/posts/default

Make sure to copy this URL, as we will need it in the next steps.






Step 2: Downloading the Workflow

Next, download the Knime workflow created by BI-FI Blogs from the link below, or drag it into your Knime Workflow Editor.

Click to Download the Knime Workflow




After you open the workflow, it should look like the image above. 

Step 3: Configuring the Workflow

If you want to use a different RSS feed, you can easily replace the URL in the Table Creator node. Simply paste the new RSS feed URL and run the workflow. This flexibility allows you to adapt the workflow to your specific needs, making it a versatile tool for tracking various feeds.

We will continue with the RSS feed of our blog from the first step. 

Step 4: Running the Workflow

Now that everything is set up, all you have to do is run the entire workflow. Once executed, the workflow will process the RSS feed and display the latest blog posts in a table view. You will be able to see all the recent blogs along with their links and publication dates, as illustrated in the image below.
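Outside of Knime, the same title/link/date extraction can be sketched in a few lines of Python. The inline sample below is in RSS 2.0 shape; note that Blogger actually serves Atom feeds, so element names for the live URL would differ:

```python
import xml.etree.ElementTree as ET

# Inline sample feed standing in for a live RSS 2.0 document.
feed = """<rss><channel>
  <item>
    <title>Knime Date Difference Calculator</title>
    <link>https://bifiblogs.blogspot.com/example-post</link>
    <pubDate>Mon, 01 Jan 2024 09:00:00 GMT</pubDate>
  </item>
</channel></rss>"""

# One row per post: title, link, publication date.
posts = [
    {
        "title": item.findtext("title"),
        "link": item.findtext("link"),
        "published": item.findtext("pubDate"),
    }
    for item in ET.fromstring(feed).iter("item")
]
```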



Benefits of Automating RSS Feeds

Automating the process of reading RSS feeds has several benefits:

  • Time-Saving: You no longer need to manually check multiple websites for updates. The workflow will automatically fetch and display the latest content for you.
  • Real-Time Updates: With every run of the workflow, you receive the most current information, ensuring you never miss out on important posts.
  • Customization: The ability to easily change the RSS feed URL allows you to tailor the workflow to your interests, whether that be specific blogs, news sites, or other content providers.

Conclusion

If you enjoyed this post and found it helpful, please don't forget to share it with your colleagues and friends. Your feedback is invaluable to us, and we encourage you to leave a comment below!

By following these steps, you now have a powerful tool at your disposal to read and automate any RSS feed using Knime. This can significantly enhance your content consumption strategy and keep you updated on the topics you care about most. Happy scraping!




Create Excel Table of Contents with Shortcut Links

When you're working with multiple sheets of data in Excel, it can be easy to feel overwhelmed and lost among all the different tabs. Navigating between various sheets manually can be time-consuming and cumbersome. Fortunately, with a little VBA code, you can create a Table of Contents page with links to each of your sheets. This handy feature allows you to quickly navigate between sheets, enhancing your workflow and productivity.

Why Use a Table of Contents in Excel?

A Table of Contents is particularly useful when you have a large workbook with numerous sheets. It provides a centralized overview, enabling you to access your data more efficiently. By simply clicking on a link in your Table of Contents, you can jump to the specific sheet you need without scrolling through tabs. This feature is invaluable for reports, dashboards, or any extensive data analysis projects.

The VBA Code

To set up your Table of Contents, you can use the following VBA code. Let’s break it down step by step to understand how it works:


Sub TableofContent()
    Dim i As Long
    On Error Resume Next
    Application.DisplayAlerts = False
    Worksheets("Table of Content").Delete
    Application.DisplayAlerts = True
    On Error GoTo 0

    ThisWorkbook.Sheets.Add before:=ThisWorkbook.Worksheets(1)
    ActiveSheet.Name = "Table of Content"
    For i = 1 To Sheets.Count
        With ActiveSheet
            .Hyperlinks.Add _
                Anchor:=ActiveSheet.Cells(i, 1), _
                Address:="", _
                SubAddress:="'" & Sheets(i).Name & "'!A1", _
                ScreenTip:=Sheets(i).Name, _
                TextToDisplay:=Sheets(i).Name
        End With
    Next i
End Sub

How the Code Works

  1. Defining the Subprocedure: The code begins by naming the subprocedure TableofContent and defining a variable i to be used in the loop.

  2. Deleting Previous Table of Contents: If a Table of Contents page already exists, the code deletes it to create a new one. This ensures that you have an up-to-date list of all your sheets.

  3. Creating the New Table of Contents Sheet: A new sheet named "Table of Content" is added before the first worksheet in your workbook.

  4. The Loop: The real magic happens in the For loop, where the code loops through each sheet in the workbook. For each sheet, a hyperlink is created in the Table of Contents. Each link directs you to cell A1 of the corresponding sheet.

Adding Navigation Back to the Table of Contents

Once you have created the Table of Contents, it’s beneficial to have a quick way to navigate back to it after clicking on any other sheet. You can achieve this by creating a simple macro as shown below:

Sub GotoMainPage()
    Sheets("Table of Content").Select
    Sheets("Table of Content").Range("A1").Select
End Sub

Assigning the Macro to a Shape

After pasting the GotoMainPage code into a module, you can assign this macro to a shape of your choice in Excel. This allows you to quickly return to your main Table of Contents page with just a click, improving your navigation experience.



Important Note on Saving

Please keep in mind that once you have applied these macros, Excel will prompt you to save your file as an "XLSM" (macro-enabled) file. This is necessary to preserve the functionality of the macros. Don’t worry; there will be no changes to your existing data; it will simply be saved as a macro-enabled version.




Conclusion

Creating a Table of Contents with shortcut links in Excel can significantly improve your efficiency when dealing with large datasets across multiple sheets. This VBA solution provides a user-friendly way to navigate your workbook, making your data management process smoother and more organized.

If you liked this project, please don’t forget to share it with your colleagues and leave a comment below! Your feedback is invaluable, and we appreciate hearing from you!




Paste URLs as Images in Excel with VBA

 In this blog, we will create an Excel VBA macro that turns any image URL into an actual image in Excel.

Here is the full code. We will break it down to understand how it works.


Sub URLPictureInsert()
    Dim rng As Range
    Dim cell As Range
    Dim Filename As String
    Dim theShape As Shape
    Dim xRg As Range
    Dim xCol As Long, k As Long
    On Error Resume Next
    Application.ScreenUpdating = False

    Set rng = Application.InputBox("Select the cells with hyperlinks.", , , , , , , 8)
    k = Application.InputBox("What should be the column difference between the links " & _
        "and the result column? Example: links in column C (3rd column), result " & _
        "column E (5th column), then you should type 2 here.", , , , , , , 1)

    For Each cell In rng
        Filename = cell
        If InStr(UCase(Filename), "JPG") > 0 Or _
           InStr(UCase(Filename), "PNG") > 0 Or _
           InStr(UCase(Filename), "JPEG") > 0 Then
            ActiveSheet.Pictures.Insert(Filename).Select
            Set theShape = Selection.ShapeRange.Item(1)

            If theShape Is Nothing Then GoTo isnill
            xCol = cell.Column + k
            Set xRg = Cells(cell.Row, xCol)
            With theShape
                .LockAspectRatio = msoTrue
                .Width = 200
                .Height = 200
                .Top = xRg.Top
                .Left = xRg.Left
            End With
            xRg.RowHeight = 210
            xRg.ColumnWidth = 40
isnill:
            Set theShape = Nothing

        End If
    Next

    Application.ScreenUpdating = True
    MsgBox ("Images pasted successfully.")
End Sub

   First of all, we named the macro URLPictureInsert and defined the necessary variables using the Dim keyword.

   Then we created a range variable called rng that takes its value from the user's selection. So if the user selects the cells with URLs in them, our range variable is set to that selection.

   Then we created a variable called k to let the user decide into which column we are going to paste the images.

   After that, we created a For loop that checks whether each URL contains JPEG, JPG, or PNG in the string.

   If nothing is found, the loop will not paste anything for that URL. However, if the condition above is met, the image is pasted into the user-specified column on the active row.

   We can set some of the image's properties, like width, height, aspect ratio, etc., depending on our needs. We can also set the column width and row height of the cells that contain the images.

   After going through each URL in the user's selection, the loop finishes with a success message.

Note: This code works on URLs containing JPEG, JPG, or PNG. However, you can add other conditions to the If statement.


   Now you can use this code snippet on any of your files. Just copy this code and paste it on the file you want.

   Also, please keep in mind that once you have applied these macros, Excel will ask you to save your file as an "XLSM" (macro-enabled) file. There will be no change to your file or the data in it; it will simply be the macro-enabled version of your file.

If you liked this project, please don't forget to share and leave a comment below.


 

