Treasure Data

Welcome to our feedback forum. Let us know how we can improve your experience with our product.

Enter your idea and we'll search to see if someone has already suggested it.

If a similar idea already exists, you can support and comment on it.

If it doesn't exist, you can post your idea so others can support it.

  1. DataConnector for Google Spreadsheet data ingestion

    DataConnector support for ingesting data from Google Spreadsheets would be useful.
    We often use Google Spreadsheets to manage master data and want to import that data in replace mode every day (an interim workaround sketch follows this entry).

    10 votes  ·  under review  ·  1 comment  ·  Inputs
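
    Until a native connector exists, a possible interim approach, sketched below in Python, is to pull the sheet as CSV through Google's spreadsheet export URL and feed the file to the existing import tooling in replace mode. The sheet ID, tab id, and output file name are hypothetical, and the sheet is assumed to be link-shared.

    ```
    import urllib.request

    # Hypothetical identifiers; replace with your own sheet ID and tab (gid).
    SHEET_ID = "your-sheet-id"
    GID = "0"

    # Google Sheets exposes accessible sheets as CSV via this export URL pattern.
    url = ("https://docs.google.com/spreadsheets/d/"
           + SHEET_ID + "/export?format=csv&gid=" + GID)

    # Download the sheet; the resulting CSV can then be imported into
    # Treasure Data with the existing tooling, replacing yesterday's copy.
    with urllib.request.urlopen(url) as resp, open("master_data.csv", "wb") as out:
        out.write(resp.read())
    ```
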
  2. [DataConnector] Override YML file parameters with options when the run command is issued

    It would be useful to be able to override YML file parameters by specifying options when the run command is issued.
    That kind of command-line flexibility helps integration with our current systems (a client-side sketch follows this entry).

    5 votes  ·  under review  ·  1 comment  ·  Inputs
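
    In the meantime, a client-side sketch of the idea, written in Python and assuming PyYAML is installed, merges "section.key=value" overrides into a copy of the YML file and passes the derived file to the unchanged run command. The script name, file names, and override syntax here are hypothetical.

    ```
    import sys
    import yaml  # PyYAML, assumed to be installed

    # Hypothetical usage: python override_yml.py load.yml in.path_prefix=logs/2016-06-01/
    base_path, *overrides = sys.argv[1:]

    with open(base_path) as f:
        config = yaml.safe_load(f)

    # Apply "section.key=value" overrides on top of the base configuration.
    for item in overrides:
        dotted, value = item.split("=", 1)
        node = config
        *parents, leaf = dotted.split(".")
        for key in parents:
            node = node.setdefault(key, {})
        node[leaf] = value

    # Write a derived config that the existing run command can consume unchanged.
    with open("override_" + base_path, "w") as f:
        yaml.safe_dump(config, f, default_flow_style=False)
    ```
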
  3. [DataConnector] Support a default behaviour when --time-column is not specified (0 or the scheduled time)

    It would be useful if the scheduled time (or 0) were inserted into the time column when --time-column is not specified.

    4 votes  ·  started  ·  1 comment  ·  Inputs
  4. Import Multiple Tables using Data Connectors

    It's painful to have a separate seed/load YML file combination for each table from the data source. There should be a way to import more than one table with just one seed/load file (a stop-gap sketch follows this entry).

    2 votes  ·  0 comments  ·  Inputs
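
    Until single-file multi-table support exists, one stop-gap, sketched in Python with PyYAML assumed, is to generate one seed file per table from a shared template. The table list, file names, and the in.table key are hypothetical; the actual key depends on the input plugin.

    ```
    import copy
    import yaml  # PyYAML, assumed to be installed

    # Hypothetical list of source tables to ingest with the same settings.
    TABLES = ["users", "orders", "order_items"]

    # A shared template seed file; only the source table name changes per copy.
    with open("seed.yml") as f:
        base = yaml.safe_load(f)

    for table in TABLES:
        config = copy.deepcopy(base)
        config["in"]["table"] = table  # assumed key; depends on the input plugin
        with open("seed_%s.yml" % table, "w") as f:
            yaml.safe_dump(config, f, default_flow_style=False)

    # Each generated seed_<table>.yml can then be issued as its own connector job.
    ```
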
  5. Host the JS SDK's endpoint ourselves

    Please provide the JS SDK's endpoint.
    We want to add extra processing, such as filtering, to the JS SDK's data, and to send data from the JS SDK to our own endpoint.
    Is there a way to get the endpoint as a Docker image, another kind of image, or open source?

    2 votes  ·  0 comments  ·  Inputs
  6. Twilio Integration

    Collect data from Twilio. This is important for logistics customers.

    2 votes  ·  0 comments  ·  Inputs
  7. Import data from Oracle RDB with Data Connector

    Many Treasure Data users also use Oracle databases. It would be very useful if the Data Connector could get data directly from Oracle.

    2 votes  ·  0 comments  ·  Inputs
  8. Append columns during bulk insert (with 'td')

    I'd like to be able to easily add columns to a CSV when I import it using the 'td' command line client. Specifically, I'm looking to append audit columns about the load: the source CSV file name, the load time, the user who initiated the load, and that sort of thing (a pre-processing sketch follows this entry).

    These are very useful later to help minimize or discover duplicates in the data, and to trace when and how certain rows got there.

    Since these metadata columns depend on when and how the data gets loaded rather than on the source data itself, it doesn't make a…

    1 vote  ·  0 comments  ·  Inputs
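
    As a pre-processing workaround until this is built in, a short Python sketch can append the audit columns to the CSV before it is handed to the 'td' client. The file names and column names below are hypothetical.

    ```
    import csv
    import datetime
    import getpass
    import os

    SOURCE = "events.csv"             # hypothetical source file
    TARGET = "events_with_audit.csv"  # enriched copy to actually import

    # Audit values depend on how and when the load happens, not on the source data.
    load_time = datetime.datetime.utcnow().isoformat()
    load_user = getpass.getuser()
    source_file = os.path.basename(SOURCE)

    with open(SOURCE, newline="") as src, open(TARGET, "w", newline="") as dst:
        reader = csv.reader(src)
        writer = csv.writer(dst)

        writer.writerow(next(reader) + ["source_file", "load_time", "load_user"])
        for row in reader:
            writer.writerow(row + [source_file, load_time, load_user])

    # The enriched CSV is then imported with the usual td workflow.
    ```
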
  9. Support embulk filter plugins for DataConnector

    I would like you to support these plugins:

    embulk-filter-column (0.4.0)
    embulk-filter-query_string (0.1.2)
    embulk-filter-row (0.2.0)
    embulk-filter-split_column (0.1.0)
    embulk-input-s3 (0.2.8)
    embulk-output-td (0.3.2)
    embulk-parser-apache-custom-log (0.4.0)
    embulk-parser-none (0.2.0)
    embulk-parser-query_string (0.3.1)
    embulk-parser-regex (0.2.1)

    Our log data on S3 (Apache combined log format) contains unnecessary records, such as accesses to static files, so filter and parser support matters to us (a pre-filtering sketch follows this entry).

    1 vote  ·  2 comments  ·  Inputs
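
    As a workaround while filter/parser plugin support is pending, the static-file accesses can be dropped client-side before upload. A Python sketch, with hypothetical file names and an extension list you would tune to your site:

    ```
    import gzip
    import re

    SOURCE = "access_log.gz"        # hypothetical: combined-format log from S3
    TARGET = "access_log.filtered"  # what actually gets ingested

    # Requests for static assets that we do not want to ingest.
    STATIC = re.compile(
        r'"(?:GET|HEAD) [^"]*\.(?:css|js|png|gif|jpg|jpeg|ico|svg|woff2?)(?:\?[^"]*)? '
    )

    with gzip.open(SOURCE, "rt") as src, open(TARGET, "w") as dst:
        for line in src:
            if STATIC.search(line):
                continue  # skip static-file accesses
            dst.write(line)
    ```
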
  10. Improve the "replace" export method to retain the schema of the table being replaced

                          Currently, the "replace" mode of exporting query results creates a new table based off of the schema of said query results, then replaces whatever existent target table exists.

                          This is an issue when dealing with exports into tables with custom schema (dates, booleans, etc) -- the table is replaced with the generic "best guess" schema using query results.

                          The "append" and "truncate" methods create a staging table based off of the schema of the target table (and then insert from that table), allowing for schema retention. Can we apply that functionality to the "replace" method as well?

    1 vote  ·  0 comments  ·  Inputs
  11. View near-realtime latest rows added for debugging

    When debugging, it would be nice to see a (near) real-time view of the latest data stored in a Treasure Data table, without having to run a query every time.

    1 vote  ·  0 comments  ·  Inputs
  12. Automatic UUID generation

    We want a mechanism that inserts a UUID with every row that gets inserted.

    PostgreSQL, for example, can support this with a column definition like the following:

    ```
    pageview_id uuid primary key default uuid_generate_v4()
    ```

    Use case:

    We pull back a bunch of rows and do some processing on them.

    Later, we want to go back and reference the original rows from some other Treasure Data query. We can, of course, match on a bunch of columns until we're assured of uniqueness, but it would simply be a lot easier if we could reference a unique id (a client-side sketch follows this entry).

    And vice versa:

    We might run a query that returns a…

    1 vote  ·  0 comments  ·  Inputs
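
    Until UUIDs can be generated server-side at insert time, a client-side sketch in Python can stamp each row with a uuid4 value before it is sent in. The file and column names are hypothetical.

    ```
    import csv
    import uuid

    SOURCE = "pageviews.csv"          # hypothetical input
    TARGET = "pageviews_with_id.csv"  # copy with a unique id per row

    with open(SOURCE, newline="") as src, open(TARGET, "w", newline="") as dst:
        reader = csv.reader(src)
        writer = csv.writer(dst)

        writer.writerow(next(reader) + ["pageview_id"])
        for row in reader:
            # uuid4 gives every row a unique identifier before ingestion, so
            # later queries can refer back to the original rows by this id.
            writer.writerow(row + [str(uuid.uuid4())])
    ```
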
  13. Support import files compressed with zlib

    Support for multiple compression formats should be considered.

    1 vote  ·  1 comment  ·  Inputs